beit-base-patch16-224-pt22k-finetuned-galaxy10-decals

This model is a fine-tuned version of microsoft/beit-base-patch16-224-pt22k on the matthieulel/galaxy10_decals dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4135
  • Accuracy: 0.8653
  • Precision: 0.8629
  • Recall: 0.8653
  • F1: 0.8629
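
As a quick usage sketch (not part of the original card), the checkpoint can be loaded through the Transformers image-classification pipeline. The model id below is this repository; `galaxy.jpg` is a placeholder path to a DECaLS-style galaxy image:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hugging Face Hub.
classifier = pipeline(
    "image-classification",
    model="matthieulel/beit-base-patch16-224-pt22k-finetuned-galaxy10-decals",
)

# "galaxy.jpg" is a placeholder image path.
predictions = classifier("galaxy.jpg")
for p in predictions:
    print(f"{p['label']}: {p['score']:.4f}")
```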

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed
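
While this section is a placeholder, the dataset id given in the summary above, matthieulel/galaxy10_decals, can be inspected with the `datasets` library. A minimal sketch; the available splits and label features should be verified against the dataset card:

```python
from datasets import load_dataset

# Load the Galaxy10 DECaLS dataset referenced in the model summary.
dataset = load_dataset("matthieulel/galaxy10_decals")

# Inspect the splits, features, and class labels it exposes.
print(dataset)
```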

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 256
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 30
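
Expressed as Transformers `TrainingArguments` (4.37 API), the configuration above would look roughly like the sketch below. `output_dir` is a placeholder, and the Adam betas/epsilon listed above are the library defaults, so they are not set explicitly:

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters listed above; not the author's actual script.
training_args = TrainingArguments(
    output_dir="beit-base-patch16-224-pt22k-finetuned-galaxy10-decals",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    gradient_accumulation_steps=4,  # effective train batch size: 64 * 4 = 256
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=30,
)
```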

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
| 1.7405        | 0.99  | 62   | 1.4465          | 0.4634   | 0.4965    | 0.4634 | 0.4178 |
| 1.0688        | 2.0   | 125  | 0.8733          | 0.7108   | 0.7043    | 0.7108 | 0.6971 |
| 0.8939        | 2.99  | 187  | 0.7320          | 0.7373   | 0.7472    | 0.7373 | 0.7331 |
| 0.7122        | 4.0   | 250  | 0.6952          | 0.7700   | 0.7632    | 0.7700 | 0.7596 |
| 0.6654        | 4.99  | 312  | 0.5980          | 0.7914   | 0.7917    | 0.7914 | 0.7910 |
| 0.667         | 6.0   | 375  | 0.6021          | 0.7892   | 0.7996    | 0.7892 | 0.7822 |
| 0.615         | 6.99  | 437  | 0.5403          | 0.8083   | 0.8098    | 0.8083 | 0.8019 |
| 0.5685        | 8.0   | 500  | 0.4871          | 0.8230   | 0.8203    | 0.8230 | 0.8200 |
| 0.5764        | 8.99  | 562  | 0.4781          | 0.8315   | 0.8332    | 0.8315 | 0.8304 |
| 0.5372        | 10.0  | 625  | 0.4768          | 0.8331   | 0.8327    | 0.8331 | 0.8285 |
| 0.5361        | 10.99 | 687  | 0.4883          | 0.8354   | 0.8368    | 0.8354 | 0.8338 |
| 0.5051        | 12.0  | 750  | 0.4693          | 0.8354   | 0.8338    | 0.8354 | 0.8328 |
| 0.4938        | 12.99 | 812  | 0.4517          | 0.8416   | 0.8418    | 0.8416 | 0.8396 |
| 0.4883        | 14.0  | 875  | 0.4434          | 0.8472   | 0.8487    | 0.8472 | 0.8442 |
| 0.4544        | 14.99 | 937  | 0.4522          | 0.8427   | 0.8411    | 0.8427 | 0.8395 |
| 0.4352        | 16.0  | 1000 | 0.4369          | 0.8512   | 0.8493    | 0.8512 | 0.8488 |
| 0.4107        | 16.99 | 1062 | 0.4545          | 0.8540   | 0.8545    | 0.8540 | 0.8504 |
| 0.4354        | 18.0  | 1125 | 0.4304          | 0.8563   | 0.8553    | 0.8563 | 0.8550 |
| 0.4337        | 18.99 | 1187 | 0.4293          | 0.8579   | 0.8572    | 0.8579 | 0.8564 |
| 0.4252        | 20.0  | 1250 | 0.4329          | 0.8506   | 0.8535    | 0.8506 | 0.8473 |
| 0.3923        | 20.99 | 1312 | 0.4171          | 0.8602   | 0.8602    | 0.8602 | 0.8586 |
| 0.4216        | 22.0  | 1375 | 0.4191          | 0.8568   | 0.8574    | 0.8568 | 0.8548 |
| 0.3847        | 22.99 | 1437 | 0.4378          | 0.8517   | 0.8509    | 0.8517 | 0.8509 |
| 0.3606        | 24.0  | 1500 | 0.4403          | 0.8585   | 0.8571    | 0.8585 | 0.8567 |
| 0.3739        | 24.99 | 1562 | 0.4228          | 0.8619   | 0.8598    | 0.8619 | 0.8594 |
| 0.3291        | 26.0  | 1625 | 0.4112          | 0.8602   | 0.8579    | 0.8602 | 0.8587 |
| 0.3441        | 26.99 | 1687 | 0.4214          | 0.8608   | 0.8592    | 0.8608 | 0.8588 |
| 0.3649        | 28.0  | 1750 | 0.4135          | 0.8653   | 0.8629    | 0.8653 | 0.8629 |
| 0.345         | 28.99 | 1812 | 0.4177          | 0.8608   | 0.8594    | 0.8608 | 0.8589 |
| 0.3435        | 29.76 | 1860 | 0.4188          | 0.8625   | 0.8610    | 0.8625 | 0.8606 |

The evaluation results reported at the top of this card correspond to the epoch 28.0 checkpoint (step 1750), which achieved the highest validation accuracy (0.8653).

Framework versions

  • Transformers 4.37.2
  • PyTorch 2.3.0
  • Datasets 2.19.1
  • Tokenizers 0.15.1