---
license: mit
datasets:
- fajrikoto/id_liputan6
language:
- id
metrics:
- rouge
base_model:
- google/pegasus-cnn_dailymail
pipeline_tag: summarization
library_name: transformers
---
|
PEGASUS Mini is a fine-tuned version of PEGASUS, starting from the google/pegasus-cnn_dailymail checkpoint. The fine-tuning targets abstractive summarization of Indonesian news articles using the Liputan6 dataset.
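
The model can be used with the Hugging Face `transformers` summarization pipeline. The snippet below is a minimal sketch: the repository ID `your-username/pegasus-mini` is a placeholder for wherever this checkpoint is hosted, and the generation lengths are illustrative values, not part of the model card.

```python
from transformers import pipeline

# Placeholder repository ID; replace with the actual location of this checkpoint.
MODEL_ID = "your-username/pegasus-mini"

summarizer = pipeline("summarization", model=MODEL_ID)

article = "Liputan6.com, Jakarta: ..."  # an Indonesian news article goes here

summary = summarizer(
    article,
    max_length=128,   # cap on the generated summary length (illustrative)
    min_length=16,    # lower bound on the summary length (illustrative)
    truncation=True,  # truncate inputs longer than the model's maximum input length
)
print(summary[0]["summary_text"])
```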
|
|
|
The model was trained on a 50,000-sample subset of the Liputan6 dataset for 3 epochs, with input sequences capped at 256 tokens and a bounded target length, making it lightweight and efficient while maintaining strong summarization performance.
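
For reference, a fine-tuning run along these lines could look like the sketch below. Only the 50,000-sample subset, the 3 epochs, and the 256-token input limit come from this card; the dataset config and column names (`canonical`, `clean_article`, `clean_summary`), the 64-token target cap, the batch size, and the learning rate are assumptions, not the exact training configuration.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

BASE = "google/pegasus-cnn_dailymail"
tokenizer = AutoTokenizer.from_pretrained(BASE)
model = AutoModelForSeq2SeqLM.from_pretrained(BASE)

# 50,000-sample subset as described above; config/column names are assumed.
raw = load_dataset("fajrikoto/id_liputan6", "canonical", split="train[:50000]")

def preprocess(batch):
    # 256-token input limit as stated in the card; 64-token target cap is an assumption.
    model_inputs = tokenizer(batch["clean_article"], max_length=256, truncation=True)
    labels = tokenizer(text_target=batch["clean_summary"], max_length=64, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = raw.map(preprocess, batched=True, remove_columns=raw.column_names)

args = Seq2SeqTrainingArguments(
    output_dir="pegasus-mini",
    num_train_epochs=3,             # as stated in the card
    per_device_train_batch_size=8,  # assumption
    learning_rate=5e-5,             # assumption
    predict_with_generate=True,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```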