swinv2-tiny-patch4-window8-256-dmae-humeda-DAV48

This model is a fine-tuned version of microsoft/swinv2-tiny-patch4-window8-256 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7283
  • Accuracy: 0.75

Model description

More information needed

Intended uses & limitations

More information needed
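
Until more details are published, here is a minimal inference sketch. It assumes the checkpoint is an image-classification fine-tune (as the SwinV2 backbone and the accuracy metric suggest); `example.jpg` is a placeholder path, not a file shipped with the model.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo = "RobertoSonic/swinv2-tiny-patch4-window8-256-dmae-humeda-DAV48"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo)
model.eval()

# "example.jpg" is a placeholder; the processor resizes inputs to 256x256.
image = Image.open("example.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(-1).item()
print(model.config.id2label[pred])
```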

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after the list):

  • learning_rate: 2e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: reduce_lr_on_plateau
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 40
  • mixed_precision_training: Native AMP
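
The following is a hedged reconstruction of how these settings map onto transformers `TrainingArguments`, assuming the standard `Trainer` workflow; the `output_dir` and everything not listed above (dataset, collator, model head) are assumptions.

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the configuration listed above.
training_args = TrainingArguments(
    output_dir="swinv2-tiny-patch4-window8-256-dmae-humeda-DAV48",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,   # effective train batch size: 32 * 4 = 128
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="reduce_lr_on_plateau",
    warmup_ratio=0.1,
    num_train_epochs=40,
    fp16=True,                       # native AMP mixed-precision training
)
```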

Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| No log        | 0.9412  | 8    | 1.4387          | 0.4432   |
| 1.5789        | 1.9412  | 16   | 1.3131          | 0.5568   |
| 1.3907        | 2.9412  | 24   | 1.1805          | 0.5909   |
| 1.3907        | 3.9412  | 32   | 1.0386          | 0.6136   |
| 1.1967        | 4.9412  | 40   | 1.0065          | 0.6136   |
| 1.0098        | 5.9412  | 48   | 0.8786          | 0.6477   |
| 1.0098        | 6.9412  | 56   | 0.8264          | 0.6932   |
| 0.863         | 7.9412  | 64   | 0.8026          | 0.7273   |
| 0.7309        | 8.9412  | 72   | 0.7853          | 0.7159   |
| 0.7309        | 9.9412  | 80   | 0.7649          | 0.7273   |
| 0.6597        | 10.9412 | 88   | 0.7671          | 0.7386   |
| 0.56          | 11.9412 | 96   | 0.7551          | 0.7159   |
| 0.56          | 12.9412 | 104  | 0.7428          | 0.7273   |
| 0.5207        | 13.9412 | 112  | 0.7396          | 0.7273   |
| 0.5108        | 14.9412 | 120  | 0.7368          | 0.7273   |
| 0.5108        | 15.9412 | 128  | 0.7366          | 0.7386   |
| 0.5062        | 16.9412 | 136  | 0.7364          | 0.7273   |
| 0.5069        | 17.9412 | 144  | 0.7329          | 0.7386   |
| 0.5069        | 18.9412 | 152  | 0.7285          | 0.7273   |
| 0.4952        | 19.9412 | 160  | 0.7371          | 0.7386   |
| 0.4979        | 20.9412 | 168  | 0.7436          | 0.7386   |
| 0.4979        | 21.9412 | 176  | 0.7338          | 0.7386   |
| 0.4745        | 22.9412 | 184  | 0.7291          | 0.75     |
| 0.4735        | 23.9412 | 192  | 0.7305          | 0.75     |
| 0.4735        | 24.9412 | 200  | 0.7301          | 0.75     |
| 0.4862        | 25.9412 | 208  | 0.7283          | 0.75     |
| 0.4955        | 26.9412 | 216  | 0.7273          | 0.75     |
| 0.4955        | 27.9412 | 224  | 0.7275          | 0.75     |
| 0.4602        | 28.9412 | 232  | 0.7280          | 0.75     |
| 0.4714        | 29.9412 | 240  | 0.7291          | 0.75     |
| 0.4714        | 30.9412 | 248  | 0.7298          | 0.75     |
| 0.4727        | 31.9412 | 256  | 0.7301          | 0.75     |
| 0.4689        | 32.9412 | 264  | 0.7293          | 0.75     |
| 0.4689        | 33.9412 | 272  | 0.7287          | 0.75     |
| 0.4725        | 34.9412 | 280  | 0.7287          | 0.75     |
| 0.4747        | 35.9412 | 288  | 0.7284          | 0.75     |
| 0.4747        | 36.9412 | 296  | 0.7284          | 0.75     |
| 0.5012        | 37.9412 | 304  | 0.7284          | 0.75     |
| 0.462         | 38.9412 | 312  | 0.7286          | 0.75     |
| 0.462         | 39.9412 | 320  | 0.7283          | 0.75     |
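
For reference, a minimal `compute_metrics` function of the kind typically paired with the `Trainer` to produce the accuracy column above. This is a hypothetical sketch; the exact metric code used for this run is not published.

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    # eval_pred is a (logits, labels) pair produced by the Trainer.
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```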

Framework versions

  • Transformers 4.48.2
  • Pytorch 2.5.1+cu124
  • Datasets 3.2.0
  • Tokenizers 0.21.0