# Model Card for T5-small Summarization Model

## Model Details

- **Model Name**: T5-small Summarization Model
- **Architecture**: T5-small
- **Purpose**: Summarization of news articles from the CNN/DailyMail dataset.

## Training Data

- **Dataset**: CNN/DailyMail dataset (version 3.0.0)

## Training Procedure

- **Learning Rate**: 2e-5
- **Batch Size**: 4 (per device)
- **Epochs**: 3
- **Evaluation**: ROUGE and BLEU scores were used to evaluate summarization quality.

## How to Use

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Yeop9690/t5-small-cnn-dailymail-summarization")
model = AutoModelForSeq2SeqLM.from_pretrained("Yeop9690/t5-small-cnn-dailymail-summarization")
```

The sketches at the end of this card show how a summary can be generated with the loaded model and how the ROUGE/BLEU metrics below can be computed.

## Evaluation

Test-set results (ROUGE on a 0-1 scale, BLEU on a 0-100 scale):

- eval_rouge1: 0.49
- eval_rouge2: 0.30
- eval_rougeL: 0.45
- eval_bleu1: 38.46
- eval_bleu2: 25.00
- eval_bleu4: 15.00

## Limitations

- This model may not perform well on highly technical or domain-specific content.
- The summaries may sometimes miss important context or nuances in the original text.

## Ethical Considerations
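## Sketch: Generating a Summary

The snippet in *How to Use* only loads the checkpoint. The sketch below extends it to produce a summary; the `summarize:` task prefix, the truncation length, and the beam-search settings are illustrative assumptions, not settings confirmed for this fine-tune.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "Yeop9690/t5-small-cnn-dailymail-summarization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

article = "(paste a news article here)"

# T5 summarization checkpoints conventionally expect a task prefix; whether this
# fine-tune was trained with one is an assumption.
inputs = tokenizer("summarize: " + article, return_tensors="pt",
                   max_length=512, truncation=True)

# Illustrative generation settings, not values reported by the model authors.
summary_ids = model.generate(
    **inputs,
    max_length=128,
    num_beams=4,
    length_penalty=2.0,
    early_stopping=True,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```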
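## Sketch: Computing ROUGE and BLEU

The evaluation figures above were presumably produced by scoring generated summaries against the dataset's reference highlights. The snippet below is a minimal sketch of that step using the Hugging Face `evaluate` library; the example strings and the choice of `evaluate` are assumptions, not the authors' actual evaluation script.

```python
import evaluate

# Hypothetical example data: model outputs paired with reference highlights.
predictions = ["the summary produced by the model"]
references = ["the reference highlight from the cnn/dailymail test split"]

rouge = evaluate.load("rouge")  # rouge1/rouge2/rougeL, reported on a 0-1 scale
bleu = evaluate.load("bleu")    # overall BLEU plus per-n-gram precisions

rouge_scores = rouge.compute(predictions=predictions, references=references)
bleu_scores = bleu.compute(predictions=predictions,
                           references=[[r] for r in references])

print(rouge_scores)  # e.g. {'rouge1': ..., 'rouge2': ..., 'rougeL': ..., 'rougeLsum': ...}
print(bleu_scores)   # e.g. {'bleu': ..., 'precisions': [...], ...}
```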