smids_10x_deit_base_sgd_00001_fold2

This model is a fine-tuned version of facebook/deit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0091
  • Accuracy: 0.5541
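The checkpoint can be loaded like any other DeiT image-classification fine-tune. The sketch below is illustrative only, not part of the original card: the checkpoint id is taken from the model name above, the image path is a placeholder, and the predicted label depends on the id2label mapping saved with the checkpoint.

```python
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Checkpoint id taken from the model name above; "example.jpg" is a
# placeholder path, not a file shipped with this repository.
model_id = "hkivancoral/smids_10x_deit_base_sgd_00001_fold2"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

image = Image.open("example.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print(model.config.id2label[predicted_class])
```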

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an illustrative TrainingArguments sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
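
A minimal sketch of how these settings might be expressed as a transformers TrainingArguments object, assuming the standard Trainer API was used; the output directory and evaluation strategy are assumptions, not stated in the card:

```python
from transformers import TrainingArguments

# Illustrative mapping of the listed hyperparameters onto
# transformers.TrainingArguments (argument names as in Transformers 4.32).
training_args = TrainingArguments(
    output_dir="smids_10x_deit_base_sgd_00001_fold2",  # assumed name
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    adam_beta1=0.9,     # matches the listed betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,  # matches the listed epsilon=1e-08
    evaluation_strategy="epoch",  # assumption; the card reports per-epoch validation
)
```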

Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 1.1301 | 1.0 | 750 | 1.1044 | 0.3361 |
| 1.0956 | 2.0 | 1500 | 1.1008 | 0.3394 |
| 1.0885 | 3.0 | 2250 | 1.0973 | 0.3527 |
| 1.0813 | 4.0 | 3000 | 1.0939 | 0.3677 |
| 1.0897 | 5.0 | 3750 | 1.0907 | 0.3794 |
| 1.0999 | 6.0 | 4500 | 1.0875 | 0.3794 |
| 1.0837 | 7.0 | 5250 | 1.0844 | 0.3810 |
| 1.0633 | 8.0 | 6000 | 1.0813 | 0.3860 |
| 1.0777 | 9.0 | 6750 | 1.0784 | 0.3977 |
| 1.0672 | 10.0 | 7500 | 1.0755 | 0.4027 |
| 1.0688 | 11.0 | 8250 | 1.0726 | 0.4143 |
| 1.0524 | 12.0 | 9000 | 1.0698 | 0.4193 |
| 1.0588 | 13.0 | 9750 | 1.0670 | 0.4276 |
| 1.0355 | 14.0 | 10500 | 1.0642 | 0.4443 |
| 1.0421 | 15.0 | 11250 | 1.0615 | 0.4526 |
| 1.0517 | 16.0 | 12000 | 1.0588 | 0.4526 |
| 1.0228 | 17.0 | 12750 | 1.0561 | 0.4609 |
| 1.052 | 18.0 | 13500 | 1.0535 | 0.4626 |
| 1.0453 | 19.0 | 14250 | 1.0509 | 0.4742 |
| 1.0307 | 20.0 | 15000 | 1.0483 | 0.4842 |
| 1.0308 | 21.0 | 15750 | 1.0459 | 0.4892 |
| 1.0369 | 22.0 | 16500 | 1.0434 | 0.5008 |
| 1.0173 | 23.0 | 17250 | 1.0411 | 0.5008 |
| 1.0178 | 24.0 | 18000 | 1.0388 | 0.5058 |
| 1.021 | 25.0 | 18750 | 1.0366 | 0.5075 |
| 1.0167 | 26.0 | 19500 | 1.0344 | 0.5075 |
| 1.0247 | 27.0 | 20250 | 1.0323 | 0.5125 |
| 1.0234 | 28.0 | 21000 | 1.0303 | 0.5208 |
| 1.003 | 29.0 | 21750 | 1.0283 | 0.5258 |
| 1.008 | 30.0 | 22500 | 1.0265 | 0.5308 |
| 1.0192 | 31.0 | 23250 | 1.0247 | 0.5341 |
| 1.0098 | 32.0 | 24000 | 1.0231 | 0.5374 |
| 0.9987 | 33.0 | 24750 | 1.0215 | 0.5408 |
| 0.9994 | 34.0 | 25500 | 1.0200 | 0.5408 |
| 1.0171 | 35.0 | 26250 | 1.0186 | 0.5408 |
| 1.013 | 36.0 | 27000 | 1.0173 | 0.5441 |
| 0.988 | 37.0 | 27750 | 1.0161 | 0.5441 |
| 0.9905 | 38.0 | 28500 | 1.0150 | 0.5491 |
| 0.9967 | 39.0 | 29250 | 1.0140 | 0.5491 |
| 0.9901 | 40.0 | 30000 | 1.0131 | 0.5557 |
| 0.9977 | 41.0 | 30750 | 1.0123 | 0.5557 |
| 1.0045 | 42.0 | 31500 | 1.0116 | 0.5557 |
| 0.9983 | 43.0 | 32250 | 1.0109 | 0.5541 |
| 0.9818 | 44.0 | 33000 | 1.0104 | 0.5541 |
| 0.9768 | 45.0 | 33750 | 1.0100 | 0.5541 |
| 0.9827 | 46.0 | 34500 | 1.0096 | 0.5541 |
| 0.9904 | 47.0 | 35250 | 1.0094 | 0.5541 |
| 0.9884 | 48.0 | 36000 | 1.0092 | 0.5541 |
| 0.9851 | 49.0 | 36750 | 1.0091 | 0.5541 |
| 0.9904 | 50.0 | 37500 | 1.0091 | 0.5541 |

Framework versions

  • Transformers 4.32.1
  • Pytorch 2.1.0+cu121
  • Datasets 2.12.0
  • Tokenizers 0.13.2