---
license: cc-by-sa-4.0
tags:
  - generated_from_trainer
base_model: jerteh/Jerteh-355
metrics:
  - f1
model-index:
  - name: Jerteh355SENTNEG2
    results: []
---

# Jerteh355SENTNEG2

This model is a fine-tuned version of [jerteh/Jerteh-355](https://huggingface.co/jerteh/Jerteh-355) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.0467
- F1: 0.6222
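
Since the checkpoint was uploaded as a `RobertaForSequenceClassification` model, it can be loaded with the standard `transformers` Auto classes. A minimal usage sketch; the repository ID `Tanor/Jerteh355SENTNEG2` is inferred from this card and may differ, and the input sentence is only a placeholder (the base model Jerteh-355 is a Serbian RoBERTa model):

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification, pipeline

# Repository ID inferred from this model card; adjust if the model lives elsewhere.
model_id = "Tanor/Jerteh355SENTNEG2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Wrap in a text-classification pipeline for convenience.
classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)

# Placeholder Serbian input; label names depend on the fine-tuning setup.
print(classifier("Ovo je primer rečenice."))
```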

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):

- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 32
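
For reproducibility, these settings map onto the `transformers` Trainer API roughly as follows. This is a sketch, not the original training script; `output_dir` is a placeholder, and the Adam betas/epsilon listed above match the `TrainingArguments` defaults, so they are not set explicitly:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="Jerteh355SENTNEG2",      # placeholder, not from the original run
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=4,       # 64 * 4 = 256 effective train batch size
    lr_scheduler_type="linear",
    num_train_epochs=32,
)
```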

### Training results

| Training Loss | Epoch  | Step | Validation Loss | F1     |
|:-------------:|:------:|:----:|:---------------:|:------:|
| No log        | 0.9895 | 47   | 0.0332          | 0.6364 |
| No log        | 2.0    | 95   | 0.0388          | 0.6667 |
| No log        | 2.9895 | 142  | 0.0363          | 0.6667 |
| No log        | 4.0    | 190  | 0.0467          | 0.6222 |
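
The F1 column would typically come from a `compute_metrics` callback passed to the Trainer. A minimal sketch, assuming binary labels and the default binary F1 averaging (the card does not specify either):

```python
import numpy as np
from sklearn.metrics import f1_score

def compute_metrics(eval_pred):
    """Turn raw logits into a binary F1 score for the Trainer's eval loop."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)   # predicted class per example
    return {"f1": f1_score(labels, preds)}
```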

### Framework versions

- Transformers 4.40.1
- Pytorch 2.2.2
- Datasets 2.19.0
- Tokenizers 0.19.1