---
library_name: transformers
license: apache-2.0
base_model: microsoft/swinv2-tiny-patch4-window8-256
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: swinv2-tiny-patch4-window8-256-dmae-humeda-DAV47
  results: []
---

# swinv2-tiny-patch4-window8-256-dmae-humeda-DAV47

This model is a fine-tuned version of [microsoft/swinv2-tiny-patch4-window8-256](https://huggingface.co/microsoft/swinv2-tiny-patch4-window8-256) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.9284
- Accuracy: 0.75
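
No usage example ships with this card; the snippet below is a minimal inference sketch, assuming the checkpoint is published on the Hub as `RobertoSonic/swinv2-tiny-patch4-window8-256-dmae-humeda-DAV47` (the repository id is inferred from the model name and uploader, and may differ):

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Repository id inferred from the model name; adjust if the checkpoint lives elsewhere.
model_id = "RobertoSonic/swinv2-tiny-patch4-window8-256-dmae-humeda-DAV47"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

image = Image.open("example.jpg").convert("RGB")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")  # resizes to 256x256 and normalizes

with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring logit back to the fine-tuned label set.
print(model.config.id2label[logits.argmax(-1).item()])
```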

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 40
- mixed_precision_training: Native AMP
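
For orientation, the list above maps onto a `transformers.TrainingArguments` configuration roughly as follows. This is a sketch, not the original training script; `output_dir` and any options not listed above are placeholders or library defaults:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swinv2-tiny-patch4-window8-256-dmae-humeda-DAV47",  # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,   # effective train batch size: 32 * 4 = 128
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine_with_restarts",
    warmup_ratio=0.1,
    num_train_epochs=40,
    fp16=True,                       # Native AMP mixed-precision training
)
```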

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| No log        | 0.9412  | 8    | 1.5602          | 0.3409   |
| 1.6237        | 1.9412  | 16   | 1.3767          | 0.4432   |
| 1.4913        | 2.9412  | 24   | 1.3316          | 0.6136   |
| 1.4913        | 3.9412  | 32   | 1.0605          | 0.6591   |
| 1.2218        | 4.9412  | 40   | 0.9235          | 0.6932   |
| 0.9148        | 5.9412  | 48   | 0.8240          | 0.75     |
| 0.9148        | 6.9412  | 56   | 0.7359          | 0.6932   |
| 0.7686        | 7.9412  | 64   | 0.7190          | 0.6932   |
| 0.6291        | 8.9412  | 72   | 0.6824          | 0.7273   |
| 0.6291        | 9.9412  | 80   | 0.7034          | 0.7614   |
| 0.5546        | 10.9412 | 88   | 0.6911          | 0.7727   |
| 0.4494        | 11.9412 | 96   | 0.6893          | 0.75     |
| 0.4494        | 12.9412 | 104  | 0.6927          | 0.7727   |
| 0.3719        | 13.9412 | 112  | 0.7180          | 0.7955   |
| 0.3478        | 14.9412 | 120  | 0.7574          | 0.7159   |
| 0.3478        | 15.9412 | 128  | 0.7665          | 0.7159   |
| 0.3212        | 16.9412 | 136  | 0.8369          | 0.7386   |
| 0.3184        | 17.9412 | 144  | 0.7906          | 0.7159   |
| 0.3184        | 18.9412 | 152  | 0.8438          | 0.7273   |
| 0.2873        | 19.9412 | 160  | 0.8233          | 0.7273   |
| 0.2553        | 20.9412 | 168  | 0.8062          | 0.7386   |
| 0.2553        | 21.9412 | 176  | 0.8711          | 0.7159   |
| 0.2373        | 22.9412 | 184  | 0.8673          | 0.7386   |
| 0.2208        | 23.9412 | 192  | 0.8600          | 0.7273   |
| 0.2208        | 24.9412 | 200  | 0.8984          | 0.7159   |
| 0.2353        | 25.9412 | 208  | 0.8848          | 0.7273   |
| 0.2187        | 26.9412 | 216  | 0.8569          | 0.75     |
| 0.2187        | 27.9412 | 224  | 0.8817          | 0.7386   |
| 0.1943        | 28.9412 | 232  | 0.8949          | 0.75     |
| 0.1926        | 29.9412 | 240  | 0.9077          | 0.7159   |
| 0.1926        | 30.9412 | 248  | 0.9200          | 0.7159   |
| 0.1816        | 31.9412 | 256  | 0.9233          | 0.7386   |
| 0.1744        | 32.9412 | 264  | 0.9231          | 0.7386   |
| 0.1744        | 33.9412 | 272  | 0.9329          | 0.7273   |
| 0.1718        | 34.9412 | 280  | 0.9277          | 0.7386   |
| 0.1701        | 35.9412 | 288  | 0.9258          | 0.75     |
| 0.1701        | 36.9412 | 296  | 0.9262          | 0.75     |
| 0.1921        | 37.9412 | 304  | 0.9274          | 0.75     |
| 0.161         | 38.9412 | 312  | 0.9282          | 0.75     |
| 0.161         | 39.9412 | 320  | 0.9284          | 0.75     |

### Framework versions

- Transformers 4.48.2
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
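
To reproduce this environment, the reported versions can be pinned; note that the `+cu124` PyTorch build comes from the CUDA 12.4 wheel index rather than PyPI:

```bash
pip install transformers==4.48.2 datasets==3.2.0 tokenizers==0.21.0
pip install torch==2.5.1 --index-url https://download.pytorch.org/whl/cu124
```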