smids_10x_deit_base_sgd_00001_fold3

This model is a fine-tuned version of facebook/deit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0056
  • Accuracy: 0.5583
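
A minimal inference sketch for this checkpoint, assuming the standard `transformers` image-classification API; the repository id is taken from this card's namespace and `example.png` is a placeholder input:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "hkivancoral/smids_10x_deit_base_sgd_00001_fold3"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)

# Placeholder image path; replace with your own input.
image = Image.open("example.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_label = model.config.id2label[logits.argmax(-1).item()]
print(predicted_label)
```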

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after the list):

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
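
A sketch of a Hugging Face `TrainingArguments` configuration matching the hyperparameters listed above. The `output_dir` and per-epoch evaluation strategy are assumptions, and model, dataset, and metric setup are omitted:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="smids_10x_deit_base_sgd_00001_fold3",  # assumed output path
    learning_rate=1e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumption: matches the per-epoch results table below
)
```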

Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 1.1044        | 1.0   | 750   | 1.1039          | 0.36     |
| 1.0936        | 2.0   | 1500  | 1.1001          | 0.3683   |
| 1.0833        | 3.0   | 2250  | 1.0965          | 0.385    |
| 1.0922        | 4.0   | 3000  | 1.0929          | 0.3867   |
| 1.0937        | 5.0   | 3750  | 1.0895          | 0.3933   |
| 1.0793        | 6.0   | 4500  | 1.0862          | 0.4083   |
| 1.0808        | 7.0   | 5250  | 1.0829          | 0.4133   |
| 1.0795        | 8.0   | 6000  | 1.0798          | 0.42     |
| 1.0735        | 9.0   | 6750  | 1.0767          | 0.4283   |
| 1.0874        | 10.0  | 7500  | 1.0736          | 0.435    |
| 1.0661        | 11.0  | 8250  | 1.0706          | 0.4483   |
| 1.038         | 12.0  | 9000  | 1.0677          | 0.445    |
| 1.0511        | 13.0  | 9750  | 1.0648          | 0.45     |
| 1.0539        | 14.0  | 10500 | 1.0620          | 0.4617   |
| 1.0408        | 15.0  | 11250 | 1.0592          | 0.46     |
| 1.0259        | 16.0  | 12000 | 1.0564          | 0.4617   |
| 1.0339        | 17.0  | 12750 | 1.0537          | 0.47     |
| 1.0416        | 18.0  | 13500 | 1.0511          | 0.4683   |
| 1.0423        | 19.0  | 14250 | 1.0484          | 0.4733   |
| 1.0333        | 20.0  | 15000 | 1.0459          | 0.4817   |
| 1.0189        | 21.0  | 15750 | 1.0434          | 0.485    |
| 1.0248        | 22.0  | 16500 | 1.0409          | 0.4917   |
| 1.0226        | 23.0  | 17250 | 1.0385          | 0.495    |
| 1.0228        | 24.0  | 18000 | 1.0362          | 0.5      |
| 1.0292        | 25.0  | 18750 | 1.0339          | 0.4983   |
| 1.02          | 26.0  | 19500 | 1.0317          | 0.5      |
| 1.0227        | 27.0  | 20250 | 1.0296          | 0.5017   |
| 1.0121        | 28.0  | 21000 | 1.0275          | 0.5083   |
| 1.0052        | 29.0  | 21750 | 1.0255          | 0.5183   |
| 0.9924        | 30.0  | 22500 | 1.0236          | 0.52     |
| 1.0074        | 31.0  | 23250 | 1.0218          | 0.5267   |
| 0.9858        | 32.0  | 24000 | 1.0200          | 0.5267   |
| 1.0028        | 33.0  | 24750 | 1.0184          | 0.5283   |
| 0.9845        | 34.0  | 25500 | 1.0169          | 0.5317   |
| 0.992         | 35.0  | 26250 | 1.0154          | 0.535    |
| 0.9814        | 36.0  | 27000 | 1.0141          | 0.535    |
| 1.0022        | 37.0  | 27750 | 1.0128          | 0.535    |
| 0.9973        | 38.0  | 28500 | 1.0117          | 0.5367   |
| 0.9952        | 39.0  | 29250 | 1.0106          | 0.54     |
| 0.9809        | 40.0  | 30000 | 1.0097          | 0.5417   |
| 0.9816        | 41.0  | 30750 | 1.0089          | 0.5467   |
| 1.0061        | 42.0  | 31500 | 1.0081          | 0.5533   |
| 0.9998        | 43.0  | 32250 | 1.0075          | 0.555    |
| 0.992         | 44.0  | 33000 | 1.0070          | 0.555    |
| 0.9773        | 45.0  | 33750 | 1.0065          | 0.5567   |
| 0.963         | 46.0  | 34500 | 1.0062          | 0.5567   |
| 0.9849        | 47.0  | 35250 | 1.0059          | 0.5567   |
| 0.9816        | 48.0  | 36000 | 1.0057          | 0.5583   |
| 0.999         | 49.0  | 36750 | 1.0056          | 0.5583   |
| 0.9748        | 50.0  | 37500 | 1.0056          | 0.5583   |
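
The card does not include the metric code behind the Accuracy column; a minimal sketch of the usual Trainer `compute_metrics` pattern with the `evaluate` library, shown as an assumption rather than the card's actual code:

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    # eval_pred is a (logits, labels) pair supplied by the Trainer.
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```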

Framework versions

  • Transformers 4.32.1
  • Pytorch 2.1.0+cu121
  • Datasets 2.12.0
  • Tokenizers 0.13.2