---
language: "ar"
tags:
- text-generation
datasets:
- APCD
widget:
- text: "."
- text: "عيد بأية حال"
- text: "يا قدس"
- text: "ألا ليت"
---

# GPT2-Arabic-Poetry-2023

## Model description

A model fine-tuned on an Arabic poetry dataset, based on aragpt2-medium.

## Intended uses & limitations

#### How to use

Try this [HF Space](https://huggingface.co/spaces/akhooli/poetry).
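For local inference, here is a minimal sketch using the `transformers` text-generation pipeline. The model id below is an assumption, not confirmed by this card; substitute the actual repo id of this model on the Hub.

```python
from transformers import pipeline

# Hypothetical repo id -- replace with this model's actual id on the Hub.
generator = pipeline("text-generation", model="akhooli/gpt2-arabic-poetry-2023")

# Sample a few continuations of an opening verse.
for out in generator("يا قدس", max_length=64, do_sample=True, top_p=0.95, num_return_sequences=3):
    print(out["generated_text"])
```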

#### Limitations and bias

Both the GPT2-small-arabic model (trained on Arabic Wikipedia) and this model have several limitations in terms of coverage and training performance.
Use them for demonstrations or proofs of concept, not in production.

## Training data

This model was fine-tuned on poems from several eras, around 1.4M lines in total (1.25M used for training).
The model was fine-tuned from the [aragpt2-medium](https://huggingface.co/aubmindlab/aragpt2-medium) transformer model.

## Training procedure

Training was done with the Hugging Face Trainer on a free Kaggle GPU.
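As a rough sketch of what that setup may have looked like, assuming a plain-text corpus with one verse per line; the file names and hyperparameters below are assumptions, not the actual training configuration:

```python
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

base = "aubmindlab/aragpt2-medium"
tokenizer = AutoTokenizer.from_pretrained(base)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 tokenizers have no pad token by default
model = AutoModelForCausalLM.from_pretrained(base)

# Hypothetical text files with one verse per line.
data = load_dataset("text", data_files={"train": "poems_train.txt", "validation": "poems_eval.txt"})
tokenized = data.map(lambda b: tokenizer(b["text"], truncation=True, max_length=128),
                     batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="gpt2-arabic-poetry",
        num_train_epochs=3,             # assumed, not reported in the card
        per_device_train_batch_size=8,  # assumed
        evaluation_strategy="epoch",
        fp16=True,                      # typical on a free Kaggle GPU
    ),
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```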

## Eval results

Final perplexity was 52 (consistent with exp(eval_loss): exp(3.9513) ≈ 52), with eval_accuracy = 0.3704 and eval_loss = 3.9513.

### BibTeX entry and citation info

```bibtex
@misc{khooli2023gpt2arabicpoetry,
  title={GPT2-Arabic-Poetry-2023},
  author={Abed Khooli},
  year={2023}
}
```