Google's Gemma Scope 2: 110 Petabytes of Interpretability (And One Big Question)
Google just released Gemma Scope 2, an interpretability toolkit for the entire Gemma 3 model family (270M to 27B parameters). The numbers are staggering: 110 petabytes of stored data, over 1 trillion trained parameters, sparse autoencoders and...