# xmod-roberta-base-legal-multi-indian-downstream-build_rr

This model is a fine-tuned version of [MHGanainy/xmod-roberta-base-legal-multi](https://huggingface.co/MHGanainy/xmod-roberta-base-legal-multi) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.9060
- Precision-macro: 0.6550
- Recall-macro: 0.5865
- Macro-f1: 0.6019
- Precision-micro: 0.7899
- Recall-micro: 0.7899
- Micro-f1: 0.7899
- Accuracy: 0.7899
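Note that Precision-micro, Recall-micro, Micro-f1, and Accuracy are all identical (0.7899). That is expected, not a coincidence: in single-label multi-class classification every wrong prediction is simultaneously a false positive for the predicted class and a false negative for the true class, so the micro-averaged totals collapse to accuracy. A minimal sketch with toy labels (the label values here are made up for illustration):

```python
def micro_metrics(y_true, y_pred):
    """Micro-averaged precision/recall/F1 for single-label classification.

    Summed over all classes, TP = number of correct predictions, and
    FP = FN = number of wrong predictions, so micro-P = micro-R =
    micro-F1 = accuracy.
    """
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == p)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != p)  # each miss is an FP for the predicted class
    fn = fp  # ...and an FN for the true class
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    accuracy = tp / len(y_true)
    return precision, recall, f1, accuracy

# toy example: 3 classes, 8 examples, 6 predicted correctly
y_true = [0, 1, 2, 0, 1, 2, 0, 1]
y_pred = [0, 1, 2, 0, 2, 2, 1, 1]
p, r, f, a = micro_metrics(y_true, y_pred)
assert p == r == f == a == 0.75
```

The macro scores differ from the micro scores because they average per-class metrics with equal weight, so minority classes pull them down.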
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 1
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20.0
- mixed_precision_training: Native AMP
### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision-macro | Recall-macro | Macro-f1 | Precision-micro | Recall-micro | Micro-f1 | Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 124 | 0.8476 | 0.6248 | 0.4797 | 0.4920 | 0.7513 | 0.7513 | 0.7513 | 0.7513 |
| No log | 2.0 | 248 | 0.8967 | 0.4955 | 0.5357 | 0.4972 | 0.6989 | 0.6989 | 0.6989 | 0.6989 |
| No log | 3.0 | 372 | 0.7835 | 0.5899 | 0.5435 | 0.5489 | 0.7676 | 0.7676 | 0.7676 | 0.7676 |
| No log | 4.0 | 496 | 0.7244 | 0.6551 | 0.5427 | 0.5414 | 0.7673 | 0.7673 | 0.7673 | 0.7673 |
| 0.9624 | 5.0 | 620 | 0.7272 | 0.5824 | 0.5938 | 0.5752 | 0.7642 | 0.7642 | 0.7642 | 0.7642 |
| 0.9624 | 6.0 | 744 | 0.7374 | 0.6266 | 0.5739 | 0.5727 | 0.7794 | 0.7794 | 0.7794 | 0.7794 |
| 0.9624 | 7.0 | 868 | 0.7612 | 0.6258 | 0.5706 | 0.5809 | 0.7850 | 0.7850 | 0.7850 | 0.7850 |
| 0.9624 | 8.0 | 992 | 0.7714 | 0.6249 | 0.5825 | 0.5941 | 0.7926 | 0.7926 | 0.7926 | 0.7926 |
| 0.4834 | 9.0 | 1116 | 0.8430 | 0.5990 | 0.5902 | 0.5897 | 0.7791 | 0.7791 | 0.7791 | 0.7791 |
| 0.4834 | 10.0 | 1240 | 0.8700 | 0.5904 | 0.5790 | 0.5806 | 0.7753 | 0.7753 | 0.7753 | 0.7753 |
| 0.4834 | 11.0 | 1364 | 0.9060 | 0.6550 | 0.5865 | 0.6019 | 0.7899 | 0.7899 | 0.7899 | 0.7899 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1