# gysbert_historical_fmp2_ogtok_output_emotion_primary
This model was trained from scratch on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of the metric computation follows the list):
- Loss: 1.0069
- Accuracy: 0.7123
- F1: 0.5402
- Precision: 0.6014
- Recall: 0.5229
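The reported F1, precision, and recall trail accuracy by a wide margin, which is consistent with macro averaging over imbalanced emotion classes; that averaging scheme is an assumption, since the card does not state it. A minimal sketch of a `compute_metrics` function for the Hugging Face `Trainer`, using scikit-learn:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    """Accuracy plus macro-averaged F1/precision/recall.

    Macro averaging is an assumption; the card does not document
    which scheme produced the reported numbers.
    """
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="macro", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```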
## Model description
More information needed
## Intended uses & limitations
More information needed
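Pending fuller documentation, here is a hedged usage sketch. It assumes the checkpoint is a standard sequence-classification model hosted on the Hugging Face Hub; the repository id below is a placeholder, and the expected input language and period are not documented in this card.

```python
from transformers import pipeline

# Placeholder repository id; substitute the actual Hub path of this model.
classifier = pipeline(
    "text-classification",
    model="gysbert_historical_fmp2_ogtok_output_emotion_primary",
)

# The label set (primary emotions) is implied by the model name but is
# not documented here, so inspect the output labels before relying on them.
print(classifier("What a joyful day this has been!"))
```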
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 5e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 200
- num_epochs: 30
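These values map directly onto Hugging Face `TrainingArguments`; a minimal sketch, assuming the run used the standard `Trainer` (the output directory is a placeholder, and the evaluation/logging cadence is inferred from the 100-step intervals in the results table below):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="gysbert_historical_fmp2_ogtok_output_emotion_primary",
    learning_rate=5e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=200,
    num_train_epochs=30,
    evaluation_strategy="steps",  # the results table logs every 100 steps
    eval_steps=100,
    logging_steps=100,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-8 matches the Trainer's
    # default AdamW configuration, so no optimizer override is needed.
)
```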
### Training results
| Training Loss | Epoch   | Step | Validation Loss | Accuracy | F1     | Precision | Recall |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|
| 1.6286        | 0.3030  | 100  | 1.4354          | 0.5622   | 0.1200 | 0.0937    | 0.1667 |
| 1.2292        | 0.6061  | 200  | 1.0157          | 0.6065   | 0.2132 | 0.2454    | 0.2536 |
| 1.0055        | 0.9091  | 300  | 0.9542          | 0.6491   | 0.2868 | 0.4298    | 0.3134 |
| 0.9543        | 1.2121  | 400  | 0.9205          | 0.6610   | 0.3287 | 0.3193    | 0.3487 |
| 0.8554        | 1.5152  | 500  | 0.8846          | 0.6797   | 0.3406 | 0.3563    | 0.3626 |
| 0.8496        | 1.8182  | 600  | 0.8515          | 0.6780   | 0.3376 | 0.3400    | 0.3606 |
| 0.7995        | 2.1212  | 700  | 0.8363          | 0.6797   | 0.3599 | 0.3510    | 0.3806 |
| 0.7274        | 2.4242  | 800  | 0.8196          | 0.6814   | 0.3748 | 0.5206    | 0.3849 |
| 0.7402        | 2.7273  | 900  | 0.8130          | 0.6848   | 0.4108 | 0.4717    | 0.4126 |
| 0.7739        | 3.0303  | 1000 | 0.8050          | 0.6951   | 0.4295 | 0.4618    | 0.4362 |
| 0.6437        | 3.3333  | 1100 | 0.8085          | 0.6951   | 0.4475 | 0.5600    | 0.4309 |
| 0.653         | 3.6364  | 1200 | 0.8192          | 0.6917   | 0.4540 | 0.5471    | 0.4510 |
| 0.6566        | 3.9394  | 1300 | 0.8089          | 0.6917   | 0.4642 | 0.5545    | 0.4568 |
| 0.5754        | 4.2424  | 1400 | 0.8122          | 0.6968   | 0.4751 | 0.5435    | 0.4710 |
| 0.5799        | 4.5455  | 1500 | 0.8404          | 0.6882   | 0.4636 | 0.5242    | 0.4669 |
| 0.5898        | 4.8485  | 1600 | 0.8306          | 0.7104   | 0.5037 | 0.5878    | 0.4894 |
| 0.4972        | 5.1515  | 1700 | 0.8506          | 0.6985   | 0.4996 | 0.5380    | 0.4953 |
| 0.5047        | 5.4545  | 1800 | 0.8497          | 0.7002   | 0.4994 | 0.5374    | 0.4930 |
| 0.4803        | 5.7576  | 1900 | 0.8772          | 0.6968   | 0.4752 | 0.5421    | 0.4634 |
| 0.4932        | 6.0606  | 2000 | 0.8752          | 0.6899   | 0.4817 | 0.5110    | 0.4777 |
| 0.4104        | 6.3636  | 2100 | 0.8885          | 0.6985   | 0.4884 | 0.5257    | 0.4849 |
| 0.4255        | 6.6667  | 2200 | 0.9097          | 0.6882   | 0.4862 | 0.5055    | 0.4855 |
| 0.4752        | 6.9697  | 2300 | 0.9153          | 0.7019   | 0.4990 | 0.5266    | 0.4968 |
| 0.3682        | 7.2727  | 2400 | 0.9370          | 0.6865   | 0.4866 | 0.5051    | 0.4934 |
| 0.3738        | 7.5758  | 2500 | 0.9578          | 0.6882   | 0.4779 | 0.5069    | 0.4764 |
| 0.3342        | 7.8788  | 2600 | 0.9608          | 0.6917   | 0.5202 | 0.5305    | 0.5221 |
| 0.3324        | 8.1818  | 2700 | 0.9839          | 0.6797   | 0.5076 | 0.5069    | 0.5174 |
| 0.3247        | 8.4848  | 2800 | 0.9890          | 0.6814   | 0.4860 | 0.5048    | 0.4888 |
| 0.3101        | 8.7879  | 2900 | 1.0068          | 0.6814   | 0.4793 | 0.5039    | 0.4832 |
| 0.2835        | 9.0909  | 3000 | 1.0158          | 0.6882   | 0.5092 | 0.5262    | 0.5085 |
| 0.2792        | 9.3939  | 3100 | 1.0497          | 0.6831   | 0.4888 | 0.5233    | 0.4885 |
| 0.2426        | 9.6970  | 3200 | 1.0563          | 0.6882   | 0.5163 | 0.5227    | 0.5182 |
| 0.2636        | 10.0    | 3300 | 1.0859          | 0.6882   | 0.5219 | 0.5317    | 0.5278 |
| 0.2281        | 10.3030 | 3400 | 1.1076          | 0.6831   | 0.5040 | 0.5189    | 0.5073 |
| 0.238         | 10.6061 | 3500 | 1.1152          | 0.6865   | 0.4896 | 0.5059    | 0.4945 |
| 0.1861        | 10.9091 | 3600 | 1.1460          | 0.6848   | 0.4907 | 0.5062    | 0.4928 |
### Framework versions
- Transformers 4.40.2
- Pytorch 2.1.2
- Datasets 2.18.0
- Tokenizers 0.19.1