# st5s-es-inclusivo
This model is a fine-tuned version of [flax-community/spanish-t5-small](https://huggingface.co/flax-community/spanish-t5-small) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.1779
- Score: 55.1958
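
For reference, the model can be loaded with the standard `transformers` seq2seq API. The model name suggests it rewrites Spanish text into inclusive language, but since the training data is not documented, the example input and prompt format below are assumptions rather than a verified usage recipe:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "Andresmfs/st5s-es-inclusivo"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Example input is an assumption; the expected prompt format is not documented.
text = "Los profesores y los alumnos se reunieron en el aula."
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```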
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a hedged code sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 12
- mixed_precision_training: Native AMP
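
A minimal sketch of these settings as `Seq2SeqTrainingArguments` (Transformers 4.39), assuming the card was generated by the `Trainer`; `output_dir` and the per-epoch evaluation strategy are assumptions, and the listed Adam settings match the `Trainer` defaults:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="st5s-es-inclusivo",  # assumption: not stated on the card
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=64,
    seed=42,
    num_train_epochs=12,
    lr_scheduler_type="linear",
    fp16=True,                       # Native AMP mixed-precision training
    evaluation_strategy="epoch",     # assumption: the results table is per-epoch
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer's default
# optimizer configuration, matching the values listed above.
```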
### Training results

| Training Loss | Epoch | Step | Validation Loss | Score   |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| No log        | 1.0   | 105  | 0.5096          | 21.7567 |
| No log        | 2.0   | 210  | 0.3781          | 38.3010 |
| No log        | 3.0   | 315  | 0.2693          | 49.1274 |
| No log        | 4.0   | 420  | 0.2274          | 52.3318 |
| 0.5095        | 5.0   | 525  | 0.2107          | 52.9045 |
| 0.5095        | 6.0   | 630  | 0.1982          | 54.1138 |
| 0.5095        | 7.0   | 735  | 0.1903          | 54.5379 |
| 0.5095        | 8.0   | 840  | 0.1861          | 54.7896 |
| 0.5095        | 9.0   | 945  | 0.1815          | 55.0080 |
| 0.1941        | 10.0  | 1050 | 0.1796          | 55.2191 |
| 0.1941        | 11.0  | 1155 | 0.1784          | 55.1849 |
| 0.1941        | 12.0  | 1260 | 0.1779          | 55.1958 |
### Framework versions
- Transformers 4.39.2
- Pytorch 2.2.2+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2