# swinv2-tiny-patch4-window8-256-dmae-humeda-DAV37
This model is a fine-tuned version of [microsoft/swinv2-tiny-patch4-window8-256](https://huggingface.co/microsoft/swinv2-tiny-patch4-window8-256) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 1.8351
- Accuracy: 0.6538
## Model description
More information needed
## Intended uses & limitations
More information needed
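As a minimal usage sketch, the checkpoint can be loaded with the standard `image-classification` pipeline. The model id is taken from this card; the image path is a hypothetical placeholder.

```python
from PIL import Image
from transformers import pipeline

# Load the fine-tuned checkpoint as a standard image-classification pipeline.
classifier = pipeline(
    "image-classification",
    model="RobertoSonic/swinv2-tiny-patch4-window8-256-dmae-humeda-DAV37",
)

# Hypothetical input image; replace with your own file.
image = Image.open("example.png")

# Prints the predicted labels with scores, e.g. [{'label': ..., 'score': ...}, ...]
print(classifier(image))
```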
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30
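A minimal sketch of how the hyperparameters above map onto `transformers.TrainingArguments`; the `output_dir` is a hypothetical placeholder, and the AdamW betas/epsilon are the library defaults, which match the values listed.

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters above. The effective (total) train batch size is
# per_device_train_batch_size (32) x gradient_accumulation_steps (4) = 128.
training_args = TrainingArguments(
    output_dir="swinv2-tiny-patch4-window8-256-dmae-humeda-DAV37",  # hypothetical
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,
    seed=42,
    optim="adamw_torch",          # AdamW; betas=(0.9, 0.999), epsilon=1e-08 are the defaults
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    num_train_epochs=30,
)
```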
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 0.8   | 3    | 1.6284          | 0.1346   |
| No log        | 1.8   | 6    | 1.5966          | 0.2404   |
| No log        | 2.8   | 9    | 1.5076          | 0.3942   |
| 6.28          | 3.8   | 12   | 1.2912          | 0.4615   |
| 6.28          | 4.8   | 15   | 1.2137          | 0.5096   |
| 6.28          | 5.8   | 18   | 1.1917          | 0.5385   |
| 6.28          | 6.8   | 21   | 1.1498          | 0.5673   |
| 2.9539        | 7.8   | 24   | 1.2026          | 0.5865   |
| 2.9539        | 8.8   | 27   | 1.2711          | 0.5962   |
| 2.9539        | 9.8   | 30   | 1.3534          | 0.625    |
| 2.9539        | 10.8  | 33   | 1.3210          | 0.625    |
| 0.9643        | 11.8  | 36   | 1.3940          | 0.6346   |
| 0.9643        | 12.8  | 39   | 1.4859          | 0.6346   |
| 0.9643        | 13.8  | 42   | 1.4965          | 0.6346   |
| 0.9643        | 14.8  | 45   | 1.5463          | 0.625    |
| 0.3275        | 15.8  | 48   | 1.5885          | 0.6346   |
| 0.3275        | 16.8  | 51   | 1.6466          | 0.6442   |
| 0.3275        | 17.8  | 54   | 1.8351          | 0.6538   |
| 0.3275        | 18.8  | 57   | 1.8326          | 0.6442   |
| 0.1501        | 19.8  | 60   | 1.7521          | 0.6346   |
| 0.1501        | 20.8  | 63   | 1.7806          | 0.6538   |
| 0.1501        | 21.8  | 66   | 1.7669          | 0.6538   |
| 0.1501        | 22.8  | 69   | 1.8874          | 0.6346   |
| 0.09          | 23.8  | 72   | 1.8827          | 0.6538   |
| 0.09          | 24.8  | 75   | 1.8330          | 0.6538   |
| 0.09          | 25.8  | 78   | 1.8331          | 0.6538   |
| 0.09          | 26.8  | 81   | 1.8410          | 0.6538   |
| 0.0595        | 27.8  | 84   | 1.8441          | 0.6442   |
| 0.0595        | 28.8  | 87   | 1.8444          | 0.6538   |
| 0.0595        | 29.8  | 90   | 1.8446          | 0.6538   |
### Framework versions
- Transformers 4.47.1
- PyTorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0