# results_T5

This model is a fine-tuned version of [google/flan-t5-base](https://huggingface.co/google/flan-t5-base) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.8755
- Rouge1: 0.2921
- Rouge2: 0.1519
- Rougel: 0.2857
- Rougelsum: 0.2866
## Model description
More information needed
## Intended uses & limitations
More information needed
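
Pending proper documentation, here is a minimal inference sketch. The training task is not stated in this card (the ROUGE metrics suggest a summarization-style objective), so the prompt format below is an assumption:

```python
# Minimal inference sketch; the task is undocumented, so a generic
# summarization-style prompt is assumed.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "Logeswaransr/T5_MineAI_Prototype"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "summarize: <your input text here>"  # prompt prefix is an assumption
inputs = tokenizer(text, return_tensors="pt", truncation=True)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```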
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 25
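
These values map directly onto `Seq2SeqTrainingArguments` from the `transformers` library. A sketch of the equivalent configuration, assuming a standard `Seq2SeqTrainer` setup (the `output_dir` is a guess based on the model name, and the trainer/dataset wiring is omitted):

```python
# Sketch mapping the reported hyperparameters onto transformers'
# Seq2SeqTrainingArguments; output_dir is an assumption.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="results_T5",           # assumed from the model name
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=25,
    # Adam betas/epsilon below are the transformers defaults and match
    # the values reported in the card.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```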
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|
| No log | 1.0 | 109 | 0.4682 | 0.2349 | 0.0878 | 0.2327 | 0.2344 |
| No log | 2.0 | 218 | 0.4153 | 0.2519 | 0.0965 | 0.2481 | 0.2503 |
| No log | 3.0 | 327 | 0.4102 | 0.3011 | 0.1465 | 0.2979 | 0.2990 |
| No log | 4.0 | 436 | 0.4386 | 0.2555 | 0.1138 | 0.2496 | 0.2496 |
| 0.8199 | 5.0 | 545 | 0.4784 | 0.2725 | 0.1188 | 0.2675 | 0.2665 |
| 0.8199 | 6.0 | 654 | 0.5088 | 0.2524 | 0.1066 | 0.2497 | 0.2501 |
| 0.8199 | 7.0 | 763 | 0.5680 | 0.2542 | 0.1093 | 0.2497 | 0.2496 |
| 0.8199 | 8.0 | 872 | 0.5982 | 0.2740 | 0.1375 | 0.2694 | 0.2698 |
| 0.8199 | 9.0 | 981 | 0.6575 | 0.2730 | 0.1368 | 0.2723 | 0.2714 |
| 0.0653 | 10.0 | 1090 | 0.6753 | 0.2822 | 0.1519 | 0.2798 | 0.2781 |
| 0.0653 | 11.0 | 1199 | 0.6923 | 0.2795 | 0.1486 | 0.2780 | 0.2774 |
| 0.0653 | 12.0 | 1308 | 0.7350 | 0.2471 | 0.1209 | 0.2458 | 0.2457 |
| 0.0653 | 13.0 | 1417 | 0.7698 | 0.2762 | 0.1463 | 0.2720 | 0.2733 |
| 0.0225 | 14.0 | 1526 | 0.7867 | 0.2771 | 0.1372 | 0.2763 | 0.2755 |
| 0.0225 | 15.0 | 1635 | 0.8166 | 0.3166 | 0.1689 | 0.3132 | 0.3133 |
| 0.0225 | 16.0 | 1744 | 0.8085 | 0.3027 | 0.1572 | 0.2998 | 0.3009 |
| 0.0225 | 17.0 | 1853 | 0.8162 | 0.3090 | 0.1734 | 0.3025 | 0.3038 |
| 0.0225 | 18.0 | 1962 | 0.8484 | 0.2965 | 0.1627 | 0.2917 | 0.2909 |
| 0.0105 | 19.0 | 2071 | 0.8610 | 0.2881 | 0.1487 | 0.2813 | 0.2819 |
| 0.0105 | 20.0 | 2180 | 0.8688 | 0.2811 | 0.1494 | 0.2755 | 0.2770 |
| 0.0105 | 21.0 | 2289 | 0.8733 | 0.2777 | 0.1453 | 0.2708 | 0.2724 |
| 0.0105 | 22.0 | 2398 | 0.8776 | 0.2771 | 0.1475 | 0.2709 | 0.2711 |
| 0.0061 | 23.0 | 2507 | 0.8717 | 0.2829 | 0.1467 | 0.2749 | 0.2749 |
| 0.0061 | 24.0 | 2616 | 0.8729 | 0.2878 | 0.1467 | 0.2803 | 0.2806 |
| 0.0061 | 25.0 | 2725 | 0.8755 | 0.2921 | 0.1519 | 0.2857 | 0.2866 |
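
The metric names above match the output format of the `evaluate` library's `rouge` metric. A minimal scoring sketch, assuming that implementation was used (the predictions and references below are hypothetical placeholders):

```python
# Sketch of computing rouge1/rouge2/rougeL/rougeLsum with the `evaluate`
# library; inputs here are hypothetical placeholders.
import evaluate

rouge = evaluate.load("rouge")
predictions = ["the model generated summary"]  # hypothetical
references = ["the reference summary"]         # hypothetical
scores = rouge.compute(predictions=predictions, references=references)
print(scores)  # {'rouge1': ..., 'rouge2': ..., 'rougeL': ..., 'rougeLsum': ...}
```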
### Framework versions
- Transformers 4.33.2
- Pytorch 2.0.1+cu117
- Datasets 2.14.5
- Tokenizers 0.13.3