arthd24/ext_abs_t5small
This model is a fine-tuned version of t5-small on an unknown dataset. It achieves the following results on the evaluation set:
- Train Loss: 3.1939
- Validation Loss: 3.0679
- Train Rouge1: 0.4006
- Train Rouge2: 0.1457
- Train RougeL: 0.2419
- Train RougeLsum: 0.2420
- Train BERTScore F1: 0.6168
- Train Gen Len: 240.9263
- Epoch: 4
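The snippet below is a minimal inference sketch for this checkpoint. It assumes the repository ships TensorFlow weights (consistent with the TensorFlow 2.16.1 framework version listed further down) and that inputs use the common T5 `summarize: ` prefix; the prefix and generation settings are assumptions, not documented choices of this model.

```python
# Minimal inference sketch. Assumptions: TensorFlow weights are available for
# this repo, and the fine-tuning followed the usual T5 "summarize: " prefix
# convention; adjust if a different prompt format was used.
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

model_id = "arthd24/ext_abs_t5small"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForSeq2SeqLM.from_pretrained(model_id)

article = "..."  # long input document to summarize
inputs = tokenizer(
    "summarize: " + article,
    return_tensors="tf",
    truncation=True,
    max_length=512,
)

# Train Gen Len averages ~240 tokens, so allow reasonably long outputs.
summary_ids = model.generate(**inputs, max_new_tokens=256, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```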
Model description
More information needed
Intended uses & limitations
More information needed
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
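For reference, an equivalent optimizer can be constructed with the `AdamWeightDecay` class that ships with transformers (TensorFlow/Keras side). The values below mirror the configuration listed above; this is a sketch, not the original training script.

```python
# Sketch: recreating the optimizer configuration above with transformers'
# AdamWeightDecay (values taken from the hyperparameter list in this card).
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=2e-05,
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
)

# model.compile(optimizer=optimizer)  # attach to the Keras model before fit()
```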
Training results
| Train Loss | Validation Loss | Train Rouge1 | Train Rouge2 | Train RougeL | Train RougeLsum | Train BERTScore F1 | Train Gen Len | Epoch |
|---|---|---|---|---|---|---|---|---|
| 3.3096 | 3.1447 | 0.3986 | 0.1493 | 0.2359 | 0.2366 | 0.6148 | 238.5368 | 0 |
| 3.2719 | 3.1231 | 0.3898 | 0.1414 | 0.2306 | 0.2310 | 0.6069 | 239.7053 | 1 |
| 3.2477 | 3.0994 | 0.3885 | 0.1403 | 0.2293 | 0.2293 | 0.6094 | 241.9368 | 2 |
| 3.2166 | 3.0848 | 0.3972 | 0.1458 | 0.2340 | 0.2339 | 0.6136 | 239.5474 | 3 |
| 3.1939 | 3.0679 | 0.4006 | 0.1457 | 0.2419 | 0.2420 | 0.6168 | 240.9263 | 4 |
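The ROUGE and BERTScore columns can be recomputed from pairs of generated and reference summaries with the `evaluate` library. The snippet below is a generic sketch; the exact evaluation script, dataset, and settings used for this card are not documented.

```python
# Generic metric sketch using the `evaluate` library; the actual evaluation
# data and configuration behind the table above are not documented.
import evaluate

predictions = ["generated summary ..."]  # model outputs
references = ["reference summary ..."]   # gold summaries

rouge = evaluate.load("rouge")
print(rouge.compute(predictions=predictions, references=references))
# -> dict with rouge1, rouge2, rougeL, rougeLsum

bertscore = evaluate.load("bertscore")
scores = bertscore.compute(predictions=predictions, references=references, lang="en")
print(sum(scores["f1"]) / len(scores["f1"]))  # mean BERTScore F1
```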
Framework versions
- Transformers 4.46.3
- TensorFlow 2.16.1
- Datasets 3.1.0
- Tokenizers 0.20.0
Base model
- google-t5/t5-small