---
library_name: transformers
license: mit
base_model: FacebookAI/roberta-large
tags:
  - generated_from_trainer
metrics:
  - f1
  - accuracy
model-index:
  - name: roberta-large-finetuned-augmentation-LUNAR
    results: []
---

# roberta-large-finetuned-augmentation-LUNAR

This model is a fine-tuned version of FacebookAI/roberta-large on an unspecified dataset. It achieves the following results on the evaluation set, which match the epoch-15 row of the training results table below (a sketch of how such metrics can be computed follows the list):

- Loss: 0.6061
- F1: 0.7909
- ROC AUC: 0.8390
- Accuracy: 0.5680
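
The gap between F1 and accuracy suggests a multi-label setup in which accuracy is exact-match (subset) accuracy. The card does not state this, so the multi-label framing, the averaging choices, and the 0.5 threshold below are assumptions; this is only a minimal sketch with scikit-learn of how such numbers could be produced:

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score

def compute_metrics(logits: np.ndarray, labels: np.ndarray, threshold: float = 0.5):
    """Multi-label metrics: F1, ROC AUC, and exact-match accuracy (assumed setup)."""
    probs = 1.0 / (1.0 + np.exp(-logits))     # per-label sigmoid
    preds = (probs >= threshold).astype(int)  # 0.5 threshold is an assumption
    return {
        "f1": f1_score(labels, preds, average="micro"),          # averaging assumed
        "roc_auc": roc_auc_score(labels, probs, average="macro"),  # averaging assumed
        "accuracy": accuracy_score(labels, preds),  # subset (exact-match) accuracy
    }
```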

## Model description

More information needed

## Intended uses & limitations

More information needed
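
Until more detail is added, the checkpoint can presumably be loaded like any Transformers sequence-classification model. A minimal sketch, where the hub repo id is an assumption based on the model name and the sigmoid/threshold step reflects the multi-label assumption discussed above:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Repo id assumed from the model name; replace with the actual hub path.
model_id = "sercetexam9/roberta-large-finetuned-augmentation-LUNAR"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("Example input text", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# If the head is multi-label, apply a sigmoid and threshold rather than an argmax.
probs = torch.sigmoid(logits)
predictions = (probs >= 0.5).int()
```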

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 100
- num_epochs: 20
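
A sketch of how these settings map onto Hugging Face `TrainingArguments` (argument names follow Transformers 4.45; the output directory and per-epoch evaluation/save cadence are assumptions inferred from the results table below, and the Adam betas/epsilon listed above are the optimizer defaults):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="roberta-large-finetuned-augmentation-LUNAR",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=20,
    lr_scheduler_type="cosine",
    warmup_steps=100,
    eval_strategy="epoch",  # per-epoch validation, matching the results table (assumed)
    save_strategy="epoch",  # assumed
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the library defaults.
)
```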

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1     | ROC AUC | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:------:|:-------:|:--------:|
| 0.4606        | 1.0   | 179  | 0.3928          | 0.5956 | 0.7155  | 0.4320   |
| 0.3171        | 2.0   | 358  | 0.3380          | 0.7156 | 0.7768  | 0.4727   |
| 0.2294        | 3.0   | 537  | 0.3398          | 0.7321 | 0.7927  | 0.5077   |
| 0.1528        | 4.0   | 716  | 0.3813          | 0.7577 | 0.8113  | 0.5175   |
| 0.0887        | 5.0   | 895  | 0.4250          | 0.7669 | 0.8306  | 0.5175   |
| 0.0583        | 6.0   | 1074 | 0.4355          | 0.7686 | 0.8278  | 0.5273   |
| 0.0448        | 7.0   | 1253 | 0.5045          | 0.7498 | 0.8029  | 0.5316   |
| 0.0298        | 8.0   | 1432 | 0.4862          | 0.7809 | 0.8321  | 0.5554   |
| 0.0227        | 9.0   | 1611 | 0.5282          | 0.7793 | 0.8248  | 0.5484   |
| 0.0111        | 10.0  | 1790 | 0.5567          | 0.7787 | 0.8340  | 0.5428   |
| 0.0082        | 11.0  | 1969 | 0.5762          | 0.7845 | 0.8408  | 0.5498   |
| 0.0055        | 12.0  | 2148 | 0.5771          | 0.7796 | 0.8325  | 0.5582   |
| 0.0032        | 13.0  | 2327 | 0.5884          | 0.7865 | 0.8336  | 0.5610   |
| 0.0030        | 14.0  | 2506 | 0.6064          | 0.7901 | 0.8380  | 0.5568   |
| 0.0024        | 15.0  | 2685 | 0.6061          | 0.7909 | 0.8390  | 0.5680   |
| 0.0020        | 16.0  | 2864 | 0.6041          | 0.7878 | 0.8399  | 0.5736   |
| 0.0016        | 17.0  | 3043 | 0.6129          | 0.7848 | 0.8346  | 0.5596   |
| 0.0014        | 18.0  | 3222 | 0.6129          | 0.7860 | 0.8366  | 0.5694   |
| 0.0038        | 19.0  | 3401 | 0.6143          | 0.7893 | 0.8400  | 0.5722   |

### Framework versions

- Transformers 4.45.1
- Pytorch 2.4.0
- Datasets 3.0.1
- Tokenizers 0.20.0
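
A quick sketch for confirming a matching environment; it only prints the installed versions for comparison with the list above:

```python
import datasets
import tokenizers
import torch
import transformers

# Compare against the versions listed above.
print("Transformers:", transformers.__version__)  # expected 4.45.1
print("PyTorch:", torch.__version__)              # expected 2.4.0
print("Datasets:", datasets.__version__)          # expected 3.0.1
print("Tokenizers:", tokenizers.__version__)      # expected 0.20.0
```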