---
license: mit
datasets:
- EdinburghNLP/xsum
pipeline_tag: summarization
---
# BART Large CNN Text Summarization Model
This model is based on the Facebook BART (Bidirectional and Auto-Regressive Transformers) architecture, specifically the large variant fine-tuned for text summarization tasks. BART is a sequence-to-sequence model introduced by Facebook AI, capable of handling various natural language processing tasks, including summarization.
## Model Details:
- **Architecture**: BART Large CNN
- **Pre-trained model**: BART Large
- **Fine-tuned for**: Text Summarization
- **Fine-tuning dataset**: [EdinburghNLP/xsum](https://huggingface.co/datasets/EdinburghNLP/xsum)
## Usage:
### Installation:
You can install the necessary libraries using pip:
```bash
pip install transformers datasets evaluate rouge_score
```
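### Inference:
A minimal usage sketch with the Transformers `pipeline` API. The model ID below is a placeholder based on the underlying architecture; replace it with this repository's ID.
```python
from transformers import pipeline

# Load the summarization pipeline.
# "facebook/bart-large-cnn" is an assumed placeholder; swap in this repo's model ID.
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

article = (
    "The tower is 324 metres tall, about the same height as an 81-storey building, "
    "and is the tallest structure in Paris. Its base is square, measuring 125 metres "
    "on each side."
)

# Generate a short abstractive summary.
summary = summarizer(article, max_length=60, min_length=10, do_sample=False)
print(summary[0]["summary_text"])
```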