MiniLM: 6-Layer Version

This is a 6-layer version of microsoft/MiniLM-L12-H384-uncased, created by keeping only every second layer of the original 12-layer model.
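
A checkpoint like this can be produced with a short pruning script. The sketch below is a minimal, hypothetical example using the Hugging Face `transformers` library; the exact layer indices retained (here the odd-numbered layers 1, 3, 5, 7, 9, 11) and the output path are assumptions, since the card only states that every second layer is kept.

```python
import torch
from transformers import AutoModel

# Load the full 12-layer MiniLM model (BERT architecture)
model = AutoModel.from_pretrained("microsoft/MiniLM-L12-H384-uncased")

# Keep only every second transformer layer
# (assumed choice: the odd-numbered indices 1, 3, 5, 7, 9, 11)
keep = [1, 3, 5, 7, 9, 11]
model.encoder.layer = torch.nn.ModuleList(
    [layer for i, layer in enumerate(model.encoder.layer) if i in keep]
)
model.config.num_hidden_layers = len(keep)

# Save the pruned 6-layer checkpoint
model.save_pretrained("MiniLM-L6-H384-uncased")
```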
