Published May 14, 2025 | Version v3
Data paper · Open access

Small Singular Values Matter: A Random Matrix Analysis of Transformer Models

Description

This work analyzes the spectra of weight matrices in pretrained transformer models to understand the role of outliers. Comparing Random Matrix Theory (RMT) predictions with the properties of trained weights, we associate agreement with random noise and deviations with learned structure. Surprisingly, we find that the RMT predictions for spectral properties are violated not only for the large singular values but also for the small ones. A comparison of the corresponding singular vectors with the eigenvectors of the activation covariance matrices shows substantial overlap in the regions that deviate from RMT expectations, indicating that important directions of the data may be encoded in the small singular values. We verify this result by measuring the increase in perplexity when these singular values are removed from the matrix, and find that they indeed encode important information: their removal leads to larger perplexity increases than removing singular values from the bulk of the spectrum. After fine-tuning, the smallest singular values can even be the third most important decile of the singular value spectrum. Finally, we introduce a linear random matrix model that explains how singular vectors corresponding to small singular values can carry more information than those corresponding to larger ones. Our results highlight the relevance of small singular values and offer both theoretical and empirical insights, informing the design of SVD-based pruning methods for large language models.
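The removal experiment described in the abstract can be sketched as a plain SVD truncation: decompose a weight matrix, zero out a chosen set of singular values, and reconstruct. The matrix size, index choices, and helper name below are illustrative assumptions, not the authors' code:

```python
import numpy as np

def remove_singular_values(W, indices):
    """Reconstruct W with the singular values at `indices` zeroed out.

    np.linalg.svd returns singular values sorted in descending order,
    so the smallest ones sit at the end of the spectrum.
    """
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    s = s.copy()
    s[indices] = 0.0
    return (U * s) @ Vt  # equivalent to U @ np.diag(s) @ Vt

# Illustrative stand-in for a transformer weight matrix (sizes are assumptions).
rng = np.random.default_rng(0)
W = rng.standard_normal((64, 64)) / np.sqrt(64)

# Remove the 6 smallest singular values (the last indices)...
W_no_small = remove_singular_values(W, np.arange(58, 64))
# ...versus 6 singular values taken from the bulk of the spectrum.
W_no_bulk = remove_singular_values(W, np.arange(29, 35))
```

In the experiment, each pruned matrix would be substituted back into the model and the resulting perplexity compared against the unpruned baseline; the finding is that zeroing the smallest singular values hurts more than zeroing bulk ones.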

Files (4.6 MB)

Code_Small_Singular_Values_Matter.zip (4.6 MB)
md5:52db56f4e688056d240ddd04aacbe1e2

Additional details

Dates

Created
2025-05