swinv2-tiny-patch4-window8-256-dmae-humeda-DAV53

This model is a fine-tuned version of microsoft/swinv2-tiny-patch4-window8-256 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8283
  • Accuracy: 0.7045
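
As a minimal usage sketch (not part of the original card), the fine-tuned checkpoint can be loaded for image classification with the transformers pipeline; the repository id comes from this card and the image path is a placeholder:

```python
# Minimal inference sketch; the image path is a placeholder.
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="RobertoSonic/swinv2-tiny-patch4-window8-256-dmae-humeda-DAV53",
)

predictions = classifier("path/to/image.png")  # placeholder input image
for pred in predictions:
    print(f"{pred['label']}: {pred['score']:.4f}")
```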

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 3e-05
  • train_batch_size: 16
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 8
  • total_train_batch_size: 128
  • optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine_with_restarts
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 30
  • mixed_precision_training: Native AMP
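
The training script itself is not included in the card. The following is a hedged sketch of how the hyperparameters above map onto transformers TrainingArguments; the number of labels, the dataset objects, and the output directory are assumptions or placeholders, not the author's actual setup.

```python
# Configuration sketch mirroring the hyperparameters listed above.
from transformers import (
    AutoImageProcessor,
    AutoModelForImageClassification,
    TrainingArguments,
)

NUM_CLASSES = 5  # assumption: the card does not state the number of classes

image_processor = AutoImageProcessor.from_pretrained(
    "microsoft/swinv2-tiny-patch4-window8-256"
)
model = AutoModelForImageClassification.from_pretrained(
    "microsoft/swinv2-tiny-patch4-window8-256",
    num_labels=NUM_CLASSES,
    ignore_mismatched_sizes=True,  # replace the original classification head
)

training_args = TrainingArguments(
    output_dir="swinv2-tiny-patch4-window8-256-dmae-humeda-DAV53",
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=8,   # 16 x 8 = 128 total train batch size
    num_train_epochs=30,
    lr_scheduler_type="cosine_with_restarts",
    warmup_ratio=0.1,
    seed=42,
    fp16=True,                       # Native AMP mixed precision
    eval_strategy="epoch",
    save_strategy="epoch",
    load_best_model_at_end=True,
    metric_for_best_model="accuracy",
)

# A Trainer would then be built with train/eval datasets (not documented here):
# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=train_ds, eval_dataset=eval_ds,
#                   compute_metrics=compute_metrics)
# trainer.train()
```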

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 6    | 1.6493          | 0.0909   |
| No log        | 2.0   | 12   | 1.5699          | 0.3864   |
| No log        | 3.0   | 18   | 1.4384          | 0.4205   |
| No log        | 4.0   | 24   | 1.2748          | 0.4091   |
| No log        | 5.0   | 30   | 1.2428          | 0.5114   |
| No log        | 6.0   | 36   | 1.0682          | 0.6023   |
| No log        | 7.0   | 42   | 1.2919          | 0.5      |
| No log        | 8.0   | 48   | 0.9125          | 0.6591   |
| No log        | 9.0   | 54   | 1.0308          | 0.5568   |
| No log        | 10.0  | 60   | 0.8505          | 0.6705   |
| No log        | 11.0  | 66   | 0.9354          | 0.625    |
| No log        | 12.0  | 72   | 0.8283          | 0.7045   |
| No log        | 13.0  | 78   | 0.8508          | 0.6705   |
| No log        | 14.0  | 84   | 0.8072          | 0.6477   |
| No log        | 15.0  | 90   | 0.8574          | 0.6477   |
| No log        | 16.0  | 96   | 0.8278          | 0.625    |
| 0.7213        | 17.0  | 102  | 0.8671          | 0.6364   |
| 0.7213        | 18.0  | 108  | 0.8787          | 0.6364   |
| 0.7213        | 19.0  | 114  | 0.8215          | 0.6818   |
| 0.7213        | 20.0  | 120  | 0.8018          | 0.6932   |
| 0.7213        | 21.0  | 126  | 0.8278          | 0.6477   |
| 0.7213        | 22.0  | 132  | 0.8424          | 0.6364   |
| 0.7213        | 23.0  | 138  | 0.8392          | 0.625    |
| 0.7213        | 24.0  | 144  | 0.8371          | 0.625    |
| 0.7213        | 25.0  | 150  | 0.8373          | 0.625    |

Framework versions

  • Transformers 4.48.2
  • Pytorch 2.5.1+cu124
  • Datasets 3.2.0
  • Tokenizers 0.21.0