Update README.md
README.md
Implemented the [BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension](https://arxiv.org/abs/1910.13461) paper from scratch using `PyTorch` for an abstractive summarization task in Arabic.

> [!IMPORTANT]
> The model is not yet ready for inference: it cannot be loaded directly from the `Transformers` library.
>
> An inference API and integration with the `Transformers` library will be added as soon as possible.
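
BART pre-trains by corrupting input text and training the model to reconstruct it (the "denoising" in the paper's title). Below is a toy sketch of one such corruption, text infilling, where a span of tokens is replaced by a single mask token; the `corrupt` helper and its parameters are illustrative assumptions, not code from this repository:

```python
import random

# Toy sketch of BART-style text infilling (an assumption based on the
# paper: a span of tokens is collapsed into a single <mask> token, and
# the model learns to reconstruct the original sequence).
def corrupt(tokens, span_len=2, mask_token="<mask>", seed=0):
    """Replace one random span of `span_len` tokens with a single mask."""
    rng = random.Random(seed)
    start = rng.randrange(0, max(1, len(tokens) - span_len + 1))
    return tokens[:start] + [mask_token] + tokens[start + span_len:]

tokens = ["the", "cat", "sat", "on", "the", "mat"]
print(corrupt(tokens))  # e.g. a 5-token list containing one "<mask>"
```

In the real objective the span length is sampled (from a Poisson distribution in the paper) and multiple spans per sequence are masked; this sketch keeps a single fixed-length span for clarity.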

## Goal

Reproduce the BART model from scratch to understand its architecture in depth, using the minimum available resources.
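
To make the architecture concrete, here is a minimal sketch of a BART-style encoder-decoder built on `nn.Transformer`; all sizes are hypothetical toy values, and details such as positional embeddings, attention masks, and weight tying are omitted, so this is a sketch of the general shape rather than this repository's implementation:

```python
import torch
import torch.nn as nn

# Minimal BART-style encoder-decoder (hypothetical toy sizes, far
# smaller than the paper's; positional embeddings and masking omitted).
class TinyBart(nn.Module):
    def __init__(self, vocab_size=1000, d_model=64, nhead=4, layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=layers, num_decoder_layers=layers,
            dim_feedforward=4 * d_model, batch_first=True,
        )
        self.lm_head = nn.Linear(d_model, vocab_size)

    def forward(self, src_ids, tgt_ids):
        # Encode the (corrupted) source, decode the shifted target.
        out = self.transformer(self.embed(src_ids), self.embed(tgt_ids))
        return self.lm_head(out)  # (batch, tgt_len, vocab_size)

model = TinyBart()
src = torch.randint(0, 1000, (2, 8))  # corrupted input token ids
tgt = torch.randint(0, 1000, (2, 6))  # shifted target token ids
logits = model(src, tgt)
print(logits.shape)  # torch.Size([2, 6, 1000])
```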

The dataset used is the [XL-Sum (Arabic subset)](https://github.com/csebuetnlp/xl-sum):

- validation: `4689 rows`.
- test: `4689 rows`.

## Results

| Epoch | Loss (train) | Loss (validation) | Epoch Time (hours) | Training Time (hours) | Device |
| ----- | ------------ | ----------------- | ------------------ | --------------------- | ------ |
| 5 | 9.01 | 8.92 | 0.22 | 1.1 | 1 x L4OS |
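
As a quick consistency check on the table, assuming each epoch took roughly the same time, five epochs at 0.22 hours each matches the reported 1.1 hours of total training time:

```python
epoch_time_hours = 0.22  # per-epoch time from the table
epochs = 5
total = epochs * epoch_time_hours
print(round(total, 2))  # 1.1, matching the reported training time
```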

## License

This model is licensed under the `MIT` License.