SAEs for use with the SAELens library
This repository contains the following SAEs:
- 1000001536
- 833335296
- 333336576
- 166670336
- 500002816
- 666669056
Load these SAEs with SAELens as follows, where `<sae_id>` is one of the IDs listed above:

```python
from sae_lens import SAE

sae, cfg_dict, sparsity = SAE.from_pretrained(
    "demiant/sae-gemma-2-2b-multistage-tied-unfrozen-20x-l1-jump-positive-ortho-l1-10-fixed",
    "<sae_id>",
)
```
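The repository name indicates tied-weight SAEs. As background, here is a minimal sketch of a tied sparse autoencoder forward pass, using numpy and made-up dimensions; this illustrates the general technique only and is not SAELens's internal implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, d_sae = 8, 64  # hypothetical dims; real SAEs are much wider

# Tied weights: the decoder reuses the transpose of the encoder matrix.
W_enc = rng.normal(scale=0.1, size=(d_model, d_sae))
b_enc = np.zeros(d_sae)
b_dec = np.zeros(d_model)

def encode(x):
    # Affine map followed by ReLU yields a sparse, non-negative feature vector.
    return np.maximum(x @ W_enc + b_enc, 0.0)

def decode(f):
    # Tied decode: project back with W_enc transposed.
    return f @ W_enc.T + b_dec

x = rng.normal(size=(d_model,))
features = encode(x)
recon = decode(features)
print(recon.shape)  # (8,)
```

In SAELens, the loaded `sae` object exposes analogous `encode` and `decode` methods operating on model activations.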