smids_10x_deit_base_sgd_001_fold5

This model is a fine-tuned version of facebook/deit-base-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2447
  • Accuracy: 0.8967
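
The checkpoint can be loaded with the standard `transformers` image-classification pipeline. The snippet below is a minimal usage sketch; `example.png` is a placeholder path, not a file shipped with this repository:

```python
from PIL import Image
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hub for image classification.
classifier = pipeline(
    "image-classification",
    model="hkivancoral/smids_10x_deit_base_sgd_001_fold5",
)

# "example.png" is a placeholder; replace it with an image from your own data.
image = Image.open("example.png")
predictions = classifier(image)
print(predictions)  # list of {"label": ..., "score": ...} dicts
```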

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 0.001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
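
These values map directly onto the Hugging Face `TrainingArguments` API. The sketch below is a reconstruction rather than the original training script: `output_dir` and the evaluation strategy are assumptions, and the Adam settings listed above match the `Trainer` defaults, so no explicit optimizer argument is shown.

```python
from transformers import TrainingArguments

# Hyperparameters below mirror the values reported in this card.
# Anything beyond those values (output_dir, evaluation strategy) is an assumption.
training_args = TrainingArguments(
    output_dir="smids_10x_deit_base_sgd_001_fold5",  # hypothetical output path
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumption: validation metrics above are per epoch
)
```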

Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.5962        | 1.0   | 750   | 0.5861          | 0.7883   |
| 0.4051        | 2.0   | 1500  | 0.4173          | 0.8267   |
| 0.3569        | 3.0   | 2250  | 0.3612          | 0.8367   |
| 0.3447        | 4.0   | 3000  | 0.3341          | 0.8500   |
| 0.3363        | 5.0   | 3750  | 0.3169          | 0.8550   |
| 0.2736        | 6.0   | 4500  | 0.3026          | 0.8683   |
| 0.2339        | 7.0   | 5250  | 0.2954          | 0.8700   |
| 0.2686        | 8.0   | 6000  | 0.2855          | 0.8683   |
| 0.2668        | 9.0   | 6750  | 0.2807          | 0.8733   |
| 0.2470        | 10.0  | 7500  | 0.2762          | 0.8783   |
| 0.2811        | 11.0  | 8250  | 0.2739          | 0.8900   |
| 0.2638        | 12.0  | 9000  | 0.2726          | 0.8833   |
| 0.2445        | 13.0  | 9750  | 0.2668          | 0.8883   |
| 0.2450        | 14.0  | 10500 | 0.2627          | 0.8883   |
| 0.2557        | 15.0  | 11250 | 0.2593          | 0.8867   |
| 0.1782        | 16.0  | 12000 | 0.2589          | 0.8867   |
| 0.2171        | 17.0  | 12750 | 0.2586          | 0.8883   |
| 0.1998        | 18.0  | 13500 | 0.2548          | 0.8933   |
| 0.2462        | 19.0  | 14250 | 0.2572          | 0.8917   |
| 0.1609        | 20.0  | 15000 | 0.2549          | 0.8933   |
| 0.1833        | 21.0  | 15750 | 0.2494          | 0.8950   |
| 0.2212        | 22.0  | 16500 | 0.2509          | 0.8950   |
| 0.2078        | 23.0  | 17250 | 0.2493          | 0.8950   |
| 0.1922        | 24.0  | 18000 | 0.2508          | 0.8983   |
| 0.2035        | 25.0  | 18750 | 0.2506          | 0.8933   |
| 0.1816        | 26.0  | 19500 | 0.2465          | 0.8967   |
| 0.1488        | 27.0  | 20250 | 0.2466          | 0.8983   |
| 0.1736        | 28.0  | 21000 | 0.2478          | 0.8967   |
| 0.1851        | 29.0  | 21750 | 0.2450          | 0.8967   |
| 0.2091        | 30.0  | 22500 | 0.2502          | 0.8933   |
| 0.1735        | 31.0  | 23250 | 0.2445          | 0.8983   |
| 0.1511        | 32.0  | 24000 | 0.2473          | 0.8950   |
| 0.1917        | 33.0  | 24750 | 0.2450          | 0.8950   |
| 0.1536        | 34.0  | 25500 | 0.2464          | 0.8983   |
| 0.1399        | 35.0  | 26250 | 0.2436          | 0.8950   |
| 0.1867        | 36.0  | 27000 | 0.2448          | 0.8983   |
| 0.1193        | 37.0  | 27750 | 0.2459          | 0.8967   |
| 0.1456        | 38.0  | 28500 | 0.2448          | 0.9017   |
| 0.1489        | 39.0  | 29250 | 0.2453          | 0.8967   |
| 0.1393        | 40.0  | 30000 | 0.2452          | 0.8983   |
| 0.1841        | 41.0  | 30750 | 0.2456          | 0.8967   |
| 0.1682        | 42.0  | 31500 | 0.2446          | 0.8967   |
| 0.1428        | 43.0  | 32250 | 0.2457          | 0.9000   |
| 0.1636        | 44.0  | 33000 | 0.2455          | 0.8967   |
| 0.1783        | 45.0  | 33750 | 0.2453          | 0.8983   |
| 0.1167        | 46.0  | 34500 | 0.2456          | 0.8967   |
| 0.1786        | 47.0  | 35250 | 0.2449          | 0.8967   |
| 0.1666        | 48.0  | 36000 | 0.2447          | 0.8983   |
| 0.1479        | 49.0  | 36750 | 0.2447          | 0.8967   |
| 0.0990        | 50.0  | 37500 | 0.2447          | 0.8967   |

Framework versions

  • Transformers 4.32.1
  • PyTorch 2.1.0+cu121
  • Datasets 2.12.0
  • Tokenizers 0.13.2
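
Results can be sensitive to library versions, so it helps to confirm that a local environment matches the versions listed above; a quick sanity check:

```python
# Print installed versions to compare against those reported in this card.
import datasets
import tokenizers
import torch
import transformers

print("Transformers:", transformers.__version__)  # card reports 4.32.1
print("PyTorch:", torch.__version__)              # card reports 2.1.0+cu121
print("Datasets:", datasets.__version__)          # card reports 2.12.0
print("Tokenizers:", tokenizers.__version__)      # card reports 0.13.2
```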