Linear Algebra
Core Concepts
- Vector
- Matrix
- Tensor
- Eigenvalue / eigenvector
- SVD (Singular Value Decomposition)
- PCA (Principal Component Analysis)
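The decompositions above can be tried directly in NumPy. This is a minimal sketch (the matrix values are arbitrary illustration data) showing that SVD factors a matrix into `U @ diag(S) @ Vt` and that the factors reconstruct it exactly:

```python
import numpy as np

# A small matrix to decompose; the values are arbitrary illustration data.
A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])

# Thin SVD: A = U @ diag(S) @ Vt, singular values sorted in descending order.
U, S, Vt = np.linalg.svd(A, full_matrices=False)

# Multiplying the factors back together recovers the original matrix.
A_rec = U @ np.diag(S) @ Vt
print(np.allclose(A, A_rec))  # → True
```

Keeping only the largest singular values in `S` gives the best low-rank approximation of `A`, which is the idea behind PCA.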
Applications in Large Models
Embedding
- Word vectors and token embeddings are, at bottom, high-dimensional vectors: each token maps to one row of an embedding matrix.
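A toy sketch of that lookup (all sizes and values here are illustrative, far smaller than a real model's): the embedding table is a matrix with one row per vocabulary entry, and embedding a token sequence is just row indexing.

```python
import numpy as np

rng = np.random.default_rng(0)

vocab_size, d_model = 10, 4          # toy sizes; real models use tens of thousands x thousands
E = rng.standard_normal((vocab_size, d_model))  # embedding table: one row vector per token id

token_ids = np.array([2, 7, 7, 1])   # a toy token sequence
X = E[token_ids]                     # embedding lookup = indexing rows of the matrix

print(X.shape)  # → (4, 4): sequence length x embedding dimension
```

Note that repeated token ids (the two 7s) select the same row, so they get identical embedding vectors.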
Attention Mechanism
- Q, K, V are produced by matrix multiplications
- The core computation of self-attention is a (scaled) dot product between queries and keys
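Scaled dot-product attention, sketched in NumPy with toy dimensions (a single head, no masking): every query is dotted with every key, the scores are softmax-normalized, and the result weights the value vectors.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # subtract the max for numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # every query dotted with every key, scaled
    weights = softmax(scores, axis=-1)   # each row is a probability distribution
    return weights @ V                   # weighted sum of value vectors

rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 8))  # 3 tokens, head dimension 8
K = rng.standard_normal((3, 8))
V = rng.standard_normal((3, 8))

out = attention(Q, K, V)
print(out.shape)  # → (3, 8): one output vector per query token
```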
Transformer Architecture
- Linear (fully connected) layers
- Residual connections
- Feed-Forward Network
→ All involve matrix operations
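The three pieces above reduce to a few matrix operations. A minimal sketch (toy dimensions, random weights, ReLU as the nonlinearity; layer normalization omitted for brevity) of a feed-forward block with a residual connection:

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, d_ff, seq_len = 8, 32, 5            # toy dimensions

# The feed-forward network is two linear layers (matrix multiplies) with a ReLU between.
W1 = rng.standard_normal((d_model, d_ff))
W2 = rng.standard_normal((d_ff, d_model))

def ffn(x):
    return np.maximum(x @ W1, 0.0) @ W2      # Linear -> ReLU -> Linear

x = rng.standard_normal((seq_len, d_model))
y = x + ffn(x)                               # residual connection: plain vector addition
print(y.shape)  # → (5, 8)
```

The residual connection is nothing more than elementwise addition of two matrices of the same shape, which is why it preserves dimensions through the stack.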
Model Parameters
- All of a model's parameters are stored as matrices and higher-order tensors; the parameter count is just the sum of their entry counts.
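Counting parameters this way is simple arithmetic over tensor shapes. A sketch with illustrative sizes (these numbers do not correspond to any specific published model):

```python
import numpy as np

d_model, d_ff, vocab = 512, 2048, 32000      # illustrative sizes, not a specific model

# Each weight is a matrix (2-D tensor); the parameter count sums the entries of each.
shapes = {
    "embedding": (vocab, d_model),
    "ffn_in":    (d_model, d_ff),
    "ffn_out":   (d_ff, d_model),
}
total = sum(int(np.prod(s)) for s in shapes.values())
print(total)  # → 18481152 parameters for this toy configuration
```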
Dimensionality Reduction and Visualization
- Reducing the dimensionality of embedding spaces (t-SNE, UMAP, PCA) for analysis.
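PCA itself can be written in a few lines of NumPy via the SVD (t-SNE and UMAP need dedicated libraries, so this sketch uses random stand-in "embeddings"): center the data, then project onto the top right singular vectors.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 50))   # 100 stand-in "embeddings" of dimension 50

# PCA via SVD: center the data, then project onto the top-2 principal directions.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
X2 = Xc @ Vt[:2].T                   # 2-D coordinates, ready for a scatter plot

print(X2.shape)  # → (100, 2)
```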
Involution Hell © 2026 by Community, under CC BY-NC-SA 4.0