SmolLM-135M-Instruct-layer-pruned-90M-raw
A layer-pruned version of SmolLM-135M-Instruct
- Decoder layers are removed from the top of the model (the last layer is kept) to reduce the parameter count to approximately 99M; a sketch of this procedure is shown below.
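
A minimal sketch of this pruning step, assuming the base checkpoint `HuggingFaceTB/SmolLM-135M-Instruct` and the standard `transformers` Llama-style layer layout (`model.model.layers`). The number of retained layers here is illustrative, not necessarily the count used to produce this model.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "HuggingFaceTB/SmolLM-135M-Instruct"  # assumed base checkpoint
model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.bfloat16)
tokenizer = AutoTokenizer.from_pretrained(base_id)

layers = model.model.layers          # ModuleList of decoder layers
n_bottom = 19                        # hypothetical: number of bottom layers to keep
pruned = list(layers[:n_bottom]) + [layers[-1]]  # drop top layers, keep the last one

model.model.layers = torch.nn.ModuleList(pruned)
model.config.num_hidden_layers = len(pruned)
# Depending on the transformers version, per-layer `layer_idx` attributes may
# also need re-indexing for KV caching to work correctly.

n_params = sum(p.numel() for p in model.parameters())
print(f"Parameters after pruning: {n_params / 1e6:.1f}M")

model.save_pretrained("SmolLM-135M-Instruct-layer-pruned-raw")
tokenizer.save_pretrained("SmolLM-135M-Instruct-layer-pruned-raw")
```

The resulting checkpoint is "raw" in the sense that the remaining layers are saved as-is, without any post-pruning fine-tuning or healing.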