Commit 570940d · Parent(s): 3fa9cbf
Create README.md

README.md ADDED
@@ -0,0 +1,25 @@
---
title: README
emoji: 🏃
colorFrom: gray
colorTo: purple
sdk: static
pinned: false
---

# Model Description

ClinicalDistilBERT was developed by training the [BioDistilBERT-cased](https://huggingface.co/nlpie/bio-distilbert-cased?text=The+goal+of+life+is+%5BMASK%5D.) model in a continual learning fashion for 3 epochs using a total batch size of 192 on the MIMIC-III notes dataset.
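As a rough sketch of what this continual pretraining setup looks like with the Hugging Face `transformers` Trainer: the per-device batch size, gradient-accumulation split of the total batch size of 192, sequence length, and data file below are assumptions for illustration, not the exact recipe used for this model.

```python
from transformers import (
    AutoTokenizer,
    AutoModelForMaskedLM,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)
from datasets import load_dataset

# Start from the BioDistilBERT-cased checkpoint (see Initialisation below).
tokenizer = AutoTokenizer.from_pretrained("nlpie/bio-distilbert-cased")
model = AutoModelForMaskedLM.from_pretrained("nlpie/bio-distilbert-cased")

# MIMIC-III requires credentialed access; "mimic_notes.txt" is a placeholder
# for a locally prepared text file of de-identified clinical notes.
raw = load_dataset("text", data_files={"train": "mimic_notes.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = raw.map(tokenize, batched=True, remove_columns=["text"])

# Standard masked-language-modelling objective with 15% random masking.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

# Total batch size of 192, e.g. 24 per device x 8 accumulation steps (assumed split).
args = TrainingArguments(
    output_dir="clinical-distilbert",
    num_train_epochs=3,
    per_device_train_batch_size=24,
    gradient_accumulation_steps=8,
)

Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    data_collator=collator,
).train()
```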

# Initialisation

We initialise our model with the pre-trained checkpoints of the [BioDistilBERT-cased](https://huggingface.co/nlpie/bio-distilbert-cased?text=The+goal+of+life+is+%5BMASK%5D.) model available on Hugging Face.
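For example, the starting checkpoint can be loaded directly from the Hub (a minimal sketch; the repository id is taken from the link above):

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Pre-trained BioDistilBERT-cased weights used as the starting point.
tokenizer = AutoTokenizer.from_pretrained("nlpie/bio-distilbert-cased")
model = AutoModelForMaskedLM.from_pretrained("nlpie/bio-distilbert-cased")
```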
# Architecture

In this model, the size of the hidden dimension and the embedding layer are both set to 768. The vocabulary size is 28996. The number of transformer layers is 6 and the expansion rate of the feed-forward layer is 4. Overall, this model has around 65 million parameters.
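For illustration, these dimensions map onto a DistilBERT configuration roughly as follows (a sketch; the number of attention heads is not stated here and is assumed to be the DistilBERT default of 12):

```python
from transformers import DistilBertConfig, DistilBertModel

# Dimensions quoted above: 768 hidden/embedding size, 28996-token vocabulary,
# 6 transformer layers, 4x feed-forward expansion (768 * 4 = 3072).
config = DistilBertConfig(
    vocab_size=28996,
    dim=768,
    hidden_dim=4 * 768,
    n_layers=6,
    n_heads=12,  # assumed DistilBERT default; not stated in this README
)

model = DistilBertModel(config)

# Prints roughly 65 (million parameters), matching the figure above.
print(f"{sum(p.numel() for p in model.parameters()) / 1e6:.1f}M parameters")
```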
# Citation

If you use this model, please consider citing the following paper:

```bibtex

```