imvladikon committed
Commit ed4bb7e · Parent(s): e49ddbc
Update README.md
README.md CHANGED
@@ -12,4 +12,8 @@ AlephBertGimmel - Modern Hebrew pretrained BERT model with a 128K token vocabula
 
 When using AlephBertGimmel, please reference:
 
-
+```
+
+Eylon Guetta, Avi Shmidman, Shaltiel Shmidman, Cheyn Shmuel Shmidman, Joshua Guedalia, Moshe Koppel, Dan Bareket, Amit Seker and Reut Tsarfaty, "Large Pre-Trained Models with Extra-Large Vocabularies: A Contrastive Analysis of Hebrew BERT Models and a New One to Outperform Them All", Nov 2022 [http://arxiv.org/abs/2211.15199]
+
+```
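
Since the hunk context describes AlephBertGimmel as a Modern Hebrew pretrained BERT model with a 128K token vocabulary, here is a minimal usage sketch (not part of this commit) for loading it with the transformers library for masked-token prediction. The repository id `imvladikon/alephbertgimmel-base-512` is an assumption; check the model card for the exact identifier.

```python
# Minimal sketch: masked-LM inference with AlephBertGimmel via transformers.
# The repo id below is an assumption, not confirmed by this commit.
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "imvladikon/alephbertgimmel-base-512"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Fill a masked token in a Hebrew sentence.
text = f"הקפה הזה {tokenizer.mask_token} מאוד"
inputs = tokenizer(text, return_tensors="pt")
outputs = model(**inputs)

# Locate the [MASK] position and take the highest-scoring vocabulary entry.
mask_index = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
predicted_id = outputs.logits[0, mask_index].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```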