These SAEs were trained on the output of each MLP in EleutherAI/pythia-70m, using 8.2 billion tokens from the Pile training set at a context length of 2049. Each SAE has 32,768 latents.
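
As a minimal sketch of the activations these SAEs reconstruct, the snippet below captures each layer's MLP output in pythia-70m with standard `transformers` forward hooks. The commented-out SAE loading step assumes the EleutherAI `sae` library and its `Sae.load_from_hub` interface; treat that part as illustrative rather than verified.

```python
# Capture the per-layer MLP outputs in pythia-70m that these SAEs were
# trained on (one SAE per MLP).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("EleutherAI/pythia-70m")
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/pythia-70m")

mlp_outputs = {}

def make_hook(layer_idx):
    def hook(module, inputs, output):
        # output: [batch, seq, d_model] MLP activations for this layer
        mlp_outputs[layer_idx] = output.detach()
    return hook

# Pythia uses the GPT-NeoX architecture; each layer exposes a .mlp module.
for i, layer in enumerate(model.gpt_neox.layers):
    layer.mlp.register_forward_hook(make_hook(i))

tokens = tokenizer("The Pile is a large language-modeling corpus.", return_tensors="pt")
with torch.no_grad():
    model(**tokens)

print({i: v.shape for i, v in mlp_outputs.items()})

# Hypothetical follow-up, assuming the EleutherAI `sae` library: encode
# layer 0's MLP output into the 32,768 sparse latents.
# from sae import Sae
# sae = Sae.load_from_hub("EleutherAI/sae-pythia-70m-32k", hookpoint="layers.0.mlp")
# latents = sae.encode(mlp_outputs[0])
```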

