# distilroberta-rbm231k-ep20-op40-phr2
This model is a fine-tuned version of judy93536/distilroberta-rbm231k-ep20-op40 on the financial_phrasebank dataset. It achieves the following results on the evaluation set:
- Loss: 0.1510
- Accuracy: 0.9558
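For quick experimentation, a minimal inference sketch using the `transformers` pipeline API (the example sentence is illustrative; the label names come from whatever id2label mapping is stored in the model config):

```python
from transformers import pipeline

# Load the fine-tuned sentiment classifier from the Hub.
classifier = pipeline(
    "text-classification",
    model="judy93536/distilroberta-rbm231k-ep20-op40-phr2",
)

# Illustrative financial-news sentence; labels follow the model's config.
print(classifier("Operating profit rose to EUR 13.1 mn from EUR 8.7 mn in 2007."))
```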
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (an equivalent `TrainingArguments` sketch follows the list):
- learning_rate: 1.153335054745316e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.4
- num_epochs: 30
- mixed_precision_training: Native AMP
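The Adam betas and epsilon above match the `transformers` defaults, so a `TrainingArguments` configuration reproducing this setup could look like the sketch below (the output directory and per-epoch evaluation are assumptions, not stated on this card):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="distilroberta-rbm231k-ep20-op40-phr2",  # assumed placeholder path
    learning_rate=1.153335054745316e-06,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the library defaults.
    lr_scheduler_type="linear",
    warmup_ratio=0.4,
    num_train_epochs=30,
    fp16=True,                    # "Native AMP" mixed precision
    evaluation_strategy="epoch",  # assumption: the results table reports per-epoch eval
)
```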
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 114  | 1.0772          | 0.5320   |
| No log        | 2.0   | 228  | 1.0494          | 0.6159   |
| No log        | 3.0   | 342  | 0.9975          | 0.6181   |
| No log        | 4.0   | 456  | 0.9147          | 0.6181   |
| 1.0398        | 5.0   | 570  | 0.8565          | 0.6181   |
| 1.0398        | 6.0   | 684  | 0.8305          | 0.6181   |
| 1.0398        | 7.0   | 798  | 0.7759          | 0.6181   |
| 1.0398        | 8.0   | 912  | 0.7302          | 0.6490   |
| 0.8173        | 9.0   | 1026 | 0.6873          | 0.6865   |
| 0.8173        | 10.0  | 1140 | 0.6445          | 0.7174   |
| 0.8173        | 11.0  | 1254 | 0.6036          | 0.7439   |
| 0.8173        | 12.0  | 1368 | 0.5528          | 0.7550   |
| 0.8173        | 13.0  | 1482 | 0.5247          | 0.7550   |
| 0.5972        | 14.0  | 1596 | 0.4776          | 0.7572   |
| 0.5972        | 15.0  | 1710 | 0.4430          | 0.7616   |
| 0.5972        | 16.0  | 1824 | 0.3948          | 0.7704   |
| 0.5972        | 17.0  | 1938 | 0.3418          | 0.8455   |
| 0.4037        | 18.0  | 2052 | 0.2924          | 0.9029   |
| 0.4037        | 19.0  | 2166 | 0.2486          | 0.9249   |
| 0.4037        | 20.0  | 2280 | 0.2049          | 0.9360   |
| 0.4037        | 21.0  | 2394 | 0.1854          | 0.9470   |
| 0.2072        | 22.0  | 2508 | 0.1803          | 0.9470   |
| 0.2072        | 23.0  | 2622 | 0.1634          | 0.9514   |
| 0.2072        | 24.0  | 2736 | 0.1615          | 0.9536   |
| 0.2072        | 25.0  | 2850 | 0.1560          | 0.9536   |
| 0.2072        | 26.0  | 2964 | 0.1512          | 0.9558   |
| 0.1222        | 27.0  | 3078 | 0.1470          | 0.9558   |
| 0.1222        | 28.0  | 3192 | 0.1519          | 0.9536   |
| 0.1222        | 29.0  | 3306 | 0.1505          | 0.9558   |
| 0.1222        | 30.0  | 3420 | 0.1510          | 0.9558   |
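To sanity-check the reported accuracy, a hedged evaluation sketch against financial_phrasebank (the card does not state which dataset config or held-out split was used, so `sentences_allagree` and the slice below are assumptions):

```python
from datasets import load_dataset
from transformers import pipeline

# Assumption: the 'sentences_allagree' config; the card does not say which was used.
dataset = load_dataset("financial_phrasebank", "sentences_allagree", split="train")

classifier = pipeline(
    "text-classification",
    model="judy93536/distilroberta-rbm231k-ep20-op40-phr2",
)

# Classify a small illustrative slice and print predictions next to the text.
texts = dataset["sentence"][:8]
for text, pred in zip(texts, classifier(texts)):
    print(f"{pred['label']:>10}  {pred['score']:.3f}  {text[:60]}")
```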
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0