
alephbertgimmel

AlephBertGimmel is a pretrained BERT model for Modern Hebrew with a 128K-token vocabulary.

NOTE: This model was trained only on sequences of up to 128 tokens.
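Because pretraining never saw inputs longer than 128 tokens, inputs should be truncated or windowed to that length before inference. The sketch below shows a minimal helper for splitting long token sequences into 128-token windows, plus hedged loading code; the checkpoint id `dicta-il/alephbertgimmel-base` is an assumption, so check the Hub for the exact repo name.

```python
MAX_LEN = 128  # AlephBertGimmel was pretrained only on sequences up to 128 tokens


def chunk_ids(ids, max_len=MAX_LEN):
    """Split a long list of token ids into windows the model can handle."""
    return [ids[i:i + max_len] for i in range(0, len(ids), max_len)]


if __name__ == "__main__":
    # Hedged usage sketch: the repo id below is an assumption, not confirmed
    # by this card; verify it on the Hugging Face Hub before use.
    from transformers import AutoTokenizer, AutoModelForMaskedLM

    tokenizer = AutoTokenizer.from_pretrained("dicta-il/alephbertgimmel-base")
    model = AutoModelForMaskedLM.from_pretrained("dicta-il/alephbertgimmel-base")

    # Always truncate to 128 tokens; longer inputs were never seen in pretraining.
    inputs = tokenizer("שלום עולם", truncation=True, max_length=MAX_LEN,
                       return_tensors="pt")
    outputs = model(**inputs)
```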

When using AlephBertGimmel, please cite:

Eylon Guetta, Avi Shmidman, Shaltiel Shmidman, Cheyn Shmuel Shmidman, Joshua Guedalia, Moshe Koppel, Dan Bareket, Amit Seker and Reut Tsarfaty, "Large Pre-Trained Models with Extra-Large Vocabularies: A Contrastive Analysis of Hebrew BERT Models and a New One to Outperform Them All", arXiv:2211.15199, November 2022.

Model size: 78.8M parameters (Safetensors; tensor types I64, F32)