Commit e5b80b3 (1 parent: 1638c07): Update README.md

README.md CHANGED
@@ -5,7 +5,7 @@ TinyBioBERT is a distilled version of the [BioBERT](https://huggingface.co/dmis-
 This model uses a unique distillation method called ‘transformer-layer distillation’ which is applied on each layer of the student to align the attention maps and the hidden states of the student with those of the teacher.
 
 # Architecture and Initialisation
-This model uses 4 hidden layers with a hidden dimension size and an embedding size of 768 resulting in a total of 15M parameters. Due to the small hidden dimension size
+This model uses 4 hidden layers with a hidden dimension size and an embedding size of 768 resulting in a total of 15M parameters. Due to the model's small hidden dimension size, it uses random initialisation.
 
 # Citation
 
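The README only names the technique, so the following is a minimal, hypothetical PyTorch sketch of what ‘transformer-layer distillation’ typically looks like: for each student layer, an MSE loss pulls the student's attention map and hidden state towards those of a corresponding teacher layer. The uniform layer mapping, the `hidden_proj` projection, the function name, and the tensor shapes are assumptions for illustration, not the actual TinyBioBERT training code.

```python
import torch
import torch.nn as nn

def transformer_layer_distillation_loss(
    student_attentions,   # list of [batch, heads, seq, seq], one per student layer
    student_hiddens,      # list of [batch, seq, d_student], one per student layer
    teacher_attentions,   # list of [batch, heads, seq, seq], one per teacher layer
    teacher_hiddens,      # list of [batch, seq, d_teacher], one per teacher layer
    hidden_proj,          # nn.Linear(d_student, d_teacher); maps student states into the teacher space
):
    mse = nn.MSELoss()
    num_student, num_teacher = len(student_attentions), len(teacher_attentions)
    # Assumed uniform layer mapping: student layer i is aligned with one teacher layer.
    stride = num_teacher // num_student
    loss = torch.tensor(0.0)
    for i in range(num_student):
        j = (i + 1) * stride - 1  # teacher layer aligned with student layer i
        loss = loss + mse(student_attentions[i], teacher_attentions[j])
        loss = loss + mse(hidden_proj(student_hiddens[i]), teacher_hiddens[j])
    return loss

# Toy usage with random tensors (all sizes here are illustrative assumptions).
B, H, S, d_s, d_t = 2, 12, 16, 768, 768
proj = nn.Linear(d_s, d_t)
s_att = [torch.rand(B, H, S, S) for _ in range(4)]
s_hid = [torch.rand(B, S, d_s) for _ in range(4)]
t_att = [torch.rand(B, H, S, S) for _ in range(12)]
t_hid = [torch.rand(B, S, d_t) for _ in range(12)]
print(transformer_layer_distillation_loss(s_att, s_hid, t_att, t_hid, proj))
```

Because the loss is computed layer by layer rather than only on the final logits, each of the 4 student layers is directly supervised by a teacher layer, which is what distinguishes this scheme from plain output-level distillation.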