# 2

This model is a fine-tuned version of [google-t5/t5-base](https://huggingface.co/google-t5/t5-base) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.4030
- Precision: 0.5720
- Recall: 0.5957
- F1: 0.5836
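The card does not state how the three metrics relate, so it may help to note that the reported F1 is the harmonic mean of the reported precision and recall. A quick check in Python, using the values from the list above:

```python
# F1 is the harmonic mean of precision and recall:
#   F1 = 2 * P * R / (P + R)
precision = 0.572
recall = 0.5957

f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # matches the reported F1 of 0.5836
```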
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 32
- eval_batch_size: 32
- seed: 2
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 20
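With `lr_scheduler_type: linear` and no warmup steps reported, the learning rate presumably decays linearly from 3e-4 to 0 over the full run of 800 optimizer steps (20 epochs × 40 steps per epoch, consistent with the step column in the results table). A minimal pure-Python sketch of that schedule, assuming zero warmup:

```python
TOTAL_STEPS = 800  # 20 epochs x 40 optimizer steps per epoch
BASE_LR = 3e-4     # learning_rate from the list above

def linear_lr(step, base_lr=BASE_LR, total_steps=TOTAL_STEPS):
    """Linearly decay the learning rate to zero over training.

    Assumes zero warmup steps, since none are reported in the card.
    """
    return base_lr * max(0.0, (total_steps - step) / total_steps)

print(linear_lr(0))    # base rate at the start
print(linear_lr(400))  # half the base rate midway
print(linear_lr(800))  # zero at the end
```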
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|
| No log        | 1.0   | 40   | 0.3278          | 0.4527    | 0.4063 | 0.4283 |
| No log        | 2.0   | 80   | 0.2886          | 0.5300    | 0.4872 | 0.5077 |
| No log        | 3.0   | 120  | 0.2747          | 0.5576    | 0.5345 | 0.5458 |
| No log        | 4.0   | 160  | 0.2895          | 0.5717    | 0.5582 | 0.5649 |
| No log        | 5.0   | 200  | 0.3065          | 0.5518    | 0.5779 | 0.5645 |
| No log        | 6.0   | 240  | 0.3145          | 0.5391    | 0.5976 | 0.5669 |
| No log        | 7.0   | 280  | 0.3286          | 0.5508    | 0.5779 | 0.5640 |
| No log        | 8.0   | 320  | 0.3402          | 0.5461    | 0.5720 | 0.5588 |
| No log        | 9.0   | 360  | 0.3532          | 0.5380    | 0.5582 | 0.5479 |
| No log        | 10.0  | 400  | 0.3647          | 0.5747    | 0.5996 | 0.5869 |
| No log        | 11.0  | 440  | 0.3736          | 0.5562    | 0.5759 | 0.5659 |
| No log        | 12.0  | 480  | 0.3796          | 0.5599    | 0.5897 | 0.5744 |
| 0.1722        | 13.0  | 520  | 0.3881          | 0.5810    | 0.6016 | 0.5911 |
| 0.1722        | 14.0  | 560  | 0.3938          | 0.5620    | 0.5897 | 0.5756 |
| 0.1722        | 15.0  | 600  | 0.3891          | 0.5679    | 0.5937 | 0.5805 |
| 0.1722        | 16.0  | 640  | 0.3926          | 0.5647    | 0.5937 | 0.5788 |
| 0.1722        | 17.0  | 680  | 0.4031          | 0.5701    | 0.6016 | 0.5854 |
| 0.1722        | 18.0  | 720  | 0.4037          | 0.5744    | 0.5937 | 0.5839 |
| 0.1722        | 19.0  | 760  | 0.4029          | 0.5720    | 0.5957 | 0.5836 |
| 0.1722        | 20.0  | 800  | 0.4030          | 0.5720    | 0.5957 | 0.5836 |
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.1