swinv2-tiny-patch4-window8-256-finetuned-eurosat

This model is a fine-tuned version of microsoft/swinv2-tiny-patch4-window8-256 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0776
  • Accuracy: 0.9686
  • Precision Overall: 0.9697
  • Recall Overall: 0.9686
  • F1 Overall: 0.9679
  • Precision T0: 0.9333
  • Recall T0: 0.7368
  • F1 T0: 0.8235
  • Precision T1: 0.8893
  • Recall T1: 0.9765
  • F1 T1: 0.9308
  • Precision T2: 0.9939
  • Recall T2: 0.9889
  • F1 T2: 0.9914

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 128
  • eval_batch_size: 128
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 512
  • optimizer: adamw_torch (betas=(0.9, 0.999), epsilon=1e-08; no additional optimizer arguments)
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100
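As a sanity check on these values: the effective batch size is train_batch_size × gradient_accumulation_steps = 128 × 4 = 512, matching total_train_batch_size. The linear schedule with warmup ratio 0.1 can be sketched as below; the total_steps default here is illustrative, not taken from this run.

```python
def lr_at_step(step, total_steps=1000, base_lr=5e-05, warmup_ratio=0.1):
    # Linear warmup over the first warmup_ratio of steps, then linear
    # decay to zero -- the shape implied by lr_scheduler_type=linear
    # with lr_scheduler_warmup_ratio=0.1.
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * (total_steps - step) / (total_steps - warmup_steps)

# Effective batch size: per-device batch times accumulation steps.
effective_batch = 128 * 4  # 512, matching total_train_batch_size above
```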

Training results

Columns: Training Loss · Epoch · Step · Validation Loss · Accuracy · Precision Overall · Recall Overall · F1 Overall · Precision T0 · Recall T0 · F1 T0 · Precision T1 · Recall T1 · F1 T1 · Precision T2 · Recall T2 · F1 T2
1.4522 0.9524 10 1.1098 0.2801 0.7200 0.2801 0.3433 0.1199 0.9053 0.2118 0.0303 0.0392 0.0342 0.9555 0.2821 0.4356
0.9137 1.9286 20 0.6271 0.7386 0.5455 0.7386 0.6276 0.0 0.0 0.0 0.0 0.0 0.0 0.7386 1.0 0.8497
0.6332 2.9048 30 0.5053 0.7498 0.6819 0.7498 0.6553 0.0 0.0 0.0 0.6667 0.0627 0.1147 0.7513 0.9990 0.8576
0.4977 3.9762 41 0.3550 0.8656 0.8133 0.8656 0.8374 0.0 0.0 0.0 0.6511 0.8196 0.7257 0.9332 0.9606 0.9467
0.396 4.9524 51 0.3010 0.8820 0.8191 0.8820 0.8494 0.0 0.0 0.0 0.7279 0.7765 0.7514 0.9213 0.9939 0.9562
0.3438 5.9286 61 0.2886 0.8730 0.8732 0.8730 0.8659 0.4182 0.4842 0.4488 0.8424 0.5451 0.6619 0.9248 0.9949 0.9586
0.3112 6.9048 71 0.2835 0.8574 0.8846 0.8574 0.8492 0.3626 0.6526 0.4662 0.8850 0.3922 0.5435 0.9346 0.9970 0.9648
0.2826 7.9762 82 0.2100 0.9149 0.9115 0.9149 0.9103 0.6437 0.5895 0.6154 0.9005 0.7098 0.7939 0.9401 0.9990 0.9686
0.2627 8.9524 92 0.2164 0.9081 0.9144 0.9081 0.9060 0.5484 0.7158 0.6210 0.9066 0.6471 0.7551 0.9516 0.9939 0.9723
0.2385 9.9286 102 0.2145 0.9089 0.9217 0.9089 0.9065 0.5515 0.7895 0.6494 0.9576 0.6196 0.7524 0.9480 0.9949 0.9709
0.2448 10.9048 112 0.2168 0.9111 0.9216 0.9111 0.9093 0.5746 0.8105 0.6725 0.9425 0.6431 0.7646 0.9496 0.9899 0.9693
0.2353 11.9762 123 0.1721 0.9276 0.9302 0.9276 0.9252 0.6729 0.7579 0.7129 0.9534 0.7216 0.8214 0.9490 0.9970 0.9724
0.2207 12.9524 133 0.1434 0.9447 0.9431 0.9447 0.9418 0.8986 0.6526 0.7561 0.9185 0.8392 0.8770 0.9537 1.0 0.9763
0.1997 13.9286 143 0.1606 0.9283 0.9290 0.9283 0.9264 0.6635 0.7263 0.6935 0.9220 0.7412 0.8217 0.9563 0.9960 0.9757
0.2022 14.9048 153 0.1486 0.9380 0.9392 0.9380 0.9329 0.9796 0.5053 0.6667 0.8544 0.8745 0.8643 0.9572 0.9960 0.9762
0.1899 15.9762 164 0.1402 0.9425 0.9420 0.9425 0.9409 0.7812 0.7895 0.7853 0.9355 0.7961 0.8602 0.9591 0.9949 0.9767
0.1925 16.9524 174 0.1338 0.9432 0.9413 0.9432 0.9410 0.8767 0.6737 0.7619 0.8893 0.8510 0.8697 0.9609 0.9929 0.9766
0.177 17.9286 184 0.1521 0.9395 0.9408 0.9395 0.9351 0.9608 0.5158 0.6712 0.8407 0.8902 0.8648 0.9646 0.9929 0.9786
0.1889 18.9048 194 0.1357 0.9447 0.9442 0.9447 0.9415 0.9508 0.6105 0.7436 0.8871 0.8627 0.8748 0.9583 0.9980 0.9777
0.1972 19.9762 205 0.1303 0.9462 0.9444 0.9462 0.9441 0.8701 0.7053 0.7791 0.9185 0.8392 0.8770 0.9582 0.9970 0.9772
0.1923 20.9524 215 0.1344 0.9403 0.9412 0.9403 0.9389 0.7308 0.8 0.7638 0.9431 0.7804 0.8541 0.9609 0.9949 0.9776
0.1859 21.9286 225 0.1228 0.9455 0.9431 0.9455 0.9434 0.8228 0.6842 0.7471 0.9076 0.8471 0.8763 0.9638 0.9960 0.9796
0.1784 22.9048 235 0.1194 0.9470 0.9452 0.9470 0.9448 0.8889 0.6737 0.7665 0.8907 0.8627 0.8765 0.9647 0.9949 0.9796
0.175 23.9762 246 0.1190 0.9432 0.9426 0.9432 0.9421 0.7449 0.7684 0.7565 0.9156 0.8078 0.8583 0.9685 0.9949 0.9815
0.1722 24.9524 256 0.1204 0.9477 0.9461 0.9477 0.9461 0.8068 0.7474 0.7760 0.9254 0.8275 0.8737 0.9648 0.9980 0.9811
0.1727 25.9286 266 0.1168 0.9492 0.9473 0.9492 0.9475 0.8519 0.7263 0.7841 0.9114 0.8471 0.8780 0.9657 0.9970 0.9811
0.1774 26.9048 276 0.1190 0.9507 0.9513 0.9507 0.9497 0.7822 0.8316 0.8061 0.9585 0.8157 0.8814 0.9657 0.9970 0.9811
0.1769 27.9762 287 0.1093 0.9500 0.9484 0.9500 0.9481 0.8904 0.6842 0.7738 0.892 0.8745 0.8832 0.9685 0.9949 0.9815
0.1536 28.9524 297 0.1069 0.9507 0.9496 0.9507 0.9484 0.9275 0.6737 0.7805 0.9024 0.8706 0.8862 0.9639 0.9980 0.9806
0.1625 29.9286 307 0.1022 0.9559 0.9549 0.9559 0.9544 0.9211 0.7368 0.8187 0.9253 0.8745 0.8992 0.9658 0.9980 0.9816
0.1596 30.9048 317 0.1000 0.9552 0.9539 0.9552 0.9542 0.8409 0.7789 0.8087 0.9205 0.8627 0.8907 0.9733 0.9960 0.9845
0.166 31.9762 328 0.1028 0.9537 0.9522 0.9537 0.9526 0.8372 0.7579 0.7956 0.9095 0.8667 0.8876 0.9743 0.9949 0.9845
0.1507 32.9524 338 0.1034 0.9500 0.9498 0.9500 0.9495 0.77 0.8105 0.7897 0.9149 0.8431 0.8776 0.9761 0.9909 0.9834
0.1603 33.9286 348 0.0991 0.9515 0.9499 0.9515 0.9502 0.8554 0.7474 0.7978 0.8980 0.8627 0.8800 0.9723 0.9939 0.9830
0.1431 34.9048 358 0.1069 0.9507 0.9542 0.9507 0.9492 0.9836 0.6316 0.7692 0.8362 0.9412 0.8856 0.9818 0.9838 0.9828
0.1504 35.9762 369 0.1035 0.9544 0.9562 0.9544 0.9522 0.9833 0.6211 0.7613 0.8613 0.9255 0.8922 0.9781 0.9939 0.9860
0.1429 36.9524 379 0.0987 0.9567 0.9574 0.9567 0.9547 0.9531 0.6421 0.7673 0.8713 0.9294 0.8994 0.9801 0.9939 0.9869
0.1471 37.9286 389 0.1180 0.9507 0.9542 0.9507 0.9470 0.9804 0.5263 0.6849 0.8368 0.9451 0.8877 0.982 0.9929 0.9874
0.1421 38.9048 399 0.1058 0.9507 0.9562 0.9507 0.9492 0.9831 0.6105 0.7532 0.82 0.9647 0.8865 0.9888 0.9798 0.9843
0.1342 39.9762 410 0.0953 0.9567 0.9586 0.9567 0.9546 0.9836 0.6316 0.7692 0.8664 0.9412 0.9023 0.9800 0.9919 0.9859
0.1434 40.9524 420 0.0937 0.9612 0.9648 0.9612 0.9596 0.9839 0.6421 0.7771 0.8527 0.9765 0.9104 0.9919 0.9879 0.9899
0.1401 41.9286 430 0.0875 0.9619 0.9609 0.9619 0.9612 0.8523 0.7895 0.8197 0.9268 0.8941 0.9102 0.9801 0.9960 0.9880
0.1342 42.9048 440 0.0875 0.9597 0.9586 0.9597 0.9587 0.8765 0.7474 0.8068 0.9059 0.9059 0.9059 0.9801 0.9939 0.9869
0.1363 43.9762 451 0.1002 0.9597 0.9611 0.9597 0.9574 0.9836 0.6316 0.7692 0.8791 0.9412 0.9091 0.9801 0.9960 0.9880
0.1375 44.9524 461 0.1123 0.9552 0.9560 0.9552 0.9523 0.9831 0.6105 0.7532 0.8893 0.9137 0.9014 0.9705 0.9990 0.9846
0.1373 45.9286 471 0.1053 0.9567 0.9579 0.9567 0.9544 0.9836 0.6316 0.7692 0.8773 0.9255 0.9008 0.9762 0.9960 0.9860
0.135 46.9048 481 0.0947 0.9589 0.9615 0.9589 0.9573 0.9683 0.6421 0.7722 0.8566 0.9608 0.9057 0.9879 0.9889 0.9884
0.1319 47.9762 492 0.0995 0.9567 0.9601 0.9567 0.9548 0.9833 0.6211 0.7613 0.8478 0.9608 0.9007 0.9869 0.9879 0.9874
0.1318 48.9524 502 0.0882 0.9604 0.9604 0.9604 0.9599 0.8875 0.7474 0.8114 0.8810 0.9294 0.9046 0.9879 0.9889 0.9884
0.134 49.9286 512 0.0872 0.9589 0.9587 0.9589 0.9583 0.8765 0.7474 0.8068 0.8797 0.9176 0.8983 0.9869 0.9899 0.9884
0.1233 50.9048 522 0.0858 0.9627 0.9651 0.9627 0.9618 0.9565 0.6947 0.8049 0.8606 0.9686 0.9114 0.9929 0.9869 0.9899
0.1277 51.9762 533 0.0905 0.9612 0.9630 0.9612 0.9605 0.9178 0.7053 0.7976 0.8601 0.9647 0.9094 0.9939 0.9848 0.9893
0.1301 52.9524 543 0.0870 0.9634 0.9645 0.9634 0.9628 0.92 0.7263 0.8118 0.875 0.9608 0.9159 0.9919 0.9869 0.9894
0.1237 53.9286 553 0.0947 0.9619 0.9626 0.9619 0.9605 0.9420 0.6842 0.7927 0.8836 0.9529 0.9170 0.9849 0.9909 0.9879
0.1244 54.9048 563 0.0936 0.9604 0.9611 0.9604 0.9590 0.9559 0.6842 0.7975 0.8856 0.9412 0.9125 0.981 0.9919 0.9864
0.1247 55.9762 574 0.0946 0.9634 0.9648 0.9634 0.9629 0.9211 0.7368 0.8187 0.8723 0.9647 0.9162 0.9929 0.9848 0.9888
0.1179 56.9524 584 0.0930 0.9656 0.9677 0.9656 0.9644 0.9701 0.6842 0.8025 0.8768 0.9765 0.9239 0.9909 0.9899 0.9904
0.1249 57.9286 594 0.0906 0.9634 0.9647 0.9634 0.9623 0.9429 0.6947 0.8000 0.875 0.9608 0.9159 0.9899 0.9899 0.9899
0.1258 58.9048 604 0.0866 0.9627 0.9628 0.9627 0.9621 0.8875 0.7474 0.8114 0.8856 0.9412 0.9125 0.9899 0.9889 0.9894
0.1168 59.9762 615 0.0886 0.9574 0.9586 0.9574 0.9574 0.8452 0.7474 0.7933 0.8602 0.9412 0.8989 0.9949 0.9818 0.9883
0.1267 60.9524 625 0.0951 0.9619 0.9624 0.9619 0.9605 0.9420 0.6842 0.7927 0.8864 0.9490 0.9167 0.9840 0.9919 0.9879
0.1211 61.9286 635 0.0914 0.9612 0.9622 0.9612 0.9597 0.9552 0.6737 0.7901 0.8804 0.9529 0.9153 0.9839 0.9909 0.9874
0.1258 62.9048 645 0.0857 0.9649 0.9659 0.9649 0.9640 0.9444 0.7158 0.8144 0.8845 0.9608 0.9211 0.9889 0.9899 0.9894
0.1165 63.9762 656 0.0831 0.9679 0.9681 0.9679 0.9675 0.9012 0.7684 0.8295 0.8971 0.9569 0.9260 0.9929 0.9899 0.9914
0.1166 64.9524 666 0.0841 0.9634 0.9634 0.9634 0.9634 0.8065 0.7895 0.7979 0.9073 0.9216 0.9144 0.9929 0.9909 0.9919
0.117 65.9286 676 0.0885 0.9627 0.9643 0.9627 0.9615 0.9420 0.6842 0.7927 0.8693 0.9647 0.9145 0.9909 0.9889 0.9899
0.1173 66.9048 686 0.0879 0.9656 0.9657 0.9656 0.9646 0.9452 0.7263 0.8214 0.9064 0.9490 0.9272 0.9830 0.9929 0.9879
0.1133 67.9762 697 0.0850 0.9664 0.9681 0.9664 0.9656 0.9444 0.7158 0.8144 0.8768 0.9765 0.9239 0.9939 0.9879 0.9909
0.1186 68.9524 707 0.0872 0.9642 0.9657 0.9642 0.9634 0.9315 0.7158 0.8095 0.8728 0.9686 0.9182 0.9929 0.9869 0.9899
0.1181 69.9286 717 0.0842 0.9694 0.9704 0.9694 0.9687 0.9333 0.7368 0.8235 0.8893 0.9765 0.9308 0.9949 0.9899 0.9924
0.1062 70.9048 727 0.0856 0.9686 0.9693 0.9686 0.9680 0.9342 0.7474 0.8304 0.8945 0.9647 0.9283 0.9919 0.9909 0.9914
0.1159 71.9762 738 0.0878 0.9656 0.9682 0.9656 0.9648 0.9565 0.6947 0.8049 0.8651 0.9804 0.9191 0.9959 0.9879 0.9919
0.1162 72.9524 748 0.0851 0.9642 0.9644 0.9642 0.9643 0.8021 0.8105 0.8063 0.9141 0.9176 0.9159 0.9929 0.9909 0.9919
0.1201 73.9286 758 0.0828 0.9694 0.9703 0.9694 0.9685 0.9452 0.7263 0.8214 0.8957 0.9765 0.9343 0.9919 0.9909 0.9914
0.1145 74.9048 768 0.0865 0.9656 0.9670 0.9656 0.9642 0.9552 0.6737 0.7901 0.8826 0.9725 0.9254 0.9899 0.9919 0.9909
0.1172 75.9762 779 0.0835 0.9694 0.9693 0.9694 0.9692 0.8539 0.8 0.8261 0.9129 0.9451 0.9287 0.9949 0.9919 0.9934
0.1077 76.9524 789 0.0896 0.9679 0.9695 0.9679 0.9669 0.9571 0.7053 0.8121 0.8834 0.9804 0.9294 0.9929 0.9899 0.9914
0.1093 77.9286 799 0.0808 0.9686 0.9696 0.9686 0.9681 0.9221 0.7474 0.8256 0.8889 0.9725 0.9288 0.9949 0.9889 0.9919
0.1114 78.9048 809 0.0823 0.9642 0.9651 0.9642 0.9639 0.8889 0.7579 0.8182 0.8781 0.9608 0.9176 0.9949 0.9848 0.9898
0.1144 79.9762 820 0.0867 0.9686 0.9705 0.9686 0.9675 0.9706 0.6947 0.8098 0.8838 0.9843 0.9314 0.9929 0.9909 0.9919
0.1101 80.9524 830 0.0786 0.9679 0.9675 0.9679 0.9673 0.8889 0.7579 0.8182 0.9101 0.9529 0.9310 0.9899 0.9919 0.9909
0.1053 81.9286 840 0.0874 0.9679 0.9697 0.9679 0.9669 0.9571 0.7053 0.8121 0.8807 0.9843 0.9296 0.9939 0.9889 0.9914
0.1019 82.9048 850 0.0842 0.9642 0.9660 0.9642 0.9640 0.9114 0.7579 0.8276 0.8702 0.9725 0.9185 0.9959 0.9818 0.9888
0.1086 83.9762 861 0.0864 0.9694 0.9708 0.9694 0.9684 0.9577 0.7158 0.8193 0.8901 0.9843 0.9348 0.9929 0.9899 0.9914
0.1108 84.9524 871 0.0808 0.9634 0.9646 0.9634 0.9630 0.8974 0.7368 0.8092 0.8723 0.9647 0.9162 0.9949 0.9848 0.9898
0.1017 85.9286 881 0.0825 0.9686 0.9704 0.9686 0.9677 0.9577 0.7158 0.8193 0.8838 0.9843 0.9314 0.9939 0.9889 0.9914
0.1108 86.9048 891 0.0800 0.9694 0.9700 0.9694 0.9688 0.9221 0.7474 0.8256 0.8953 0.9725 0.9323 0.9939 0.9899 0.9919
0.1081 87.9762 902 0.0867 0.9679 0.9702 0.9679 0.9669 0.9710 0.7053 0.8171 0.8780 0.9882 0.9299 0.9939 0.9879 0.9909
0.1076 88.9524 912 0.0789 0.9716 0.9720 0.9716 0.9708 0.9333 0.7368 0.8235 0.9055 0.9765 0.9396 0.9929 0.9929 0.9929
0.1059 89.9286 922 0.0829 0.9694 0.9708 0.9694 0.9685 0.9583 0.7263 0.8263 0.8897 0.9804 0.9328 0.9929 0.9899 0.9914
0.1048 90.9048 932 0.0771 0.9716 0.9718 0.9716 0.9709 0.9221 0.7474 0.8256 0.9084 0.9725 0.9394 0.9929 0.9929 0.9929
0.098 91.9762 943 0.0796 0.9709 0.9715 0.9709 0.9701 0.9333 0.7368 0.8235 0.8986 0.9725 0.9341 0.9939 0.9929 0.9934
0.1075 92.9524 953 0.0798 0.9679 0.9689 0.9679 0.9673 0.9211 0.7368 0.8187 0.8857 0.9725 0.9271 0.9949 0.9889 0.9919
0.0937 93.9286 963 0.0777 0.9701 0.9709 0.9701 0.9694 0.9333 0.7368 0.8235 0.8957 0.9765 0.9343 0.9939 0.9909 0.9924
0.1099 94.9048 973 0.0760 0.9679 0.9685 0.9679 0.9673 0.9103 0.7474 0.8208 0.8917 0.9686 0.9286 0.9939 0.9889 0.9914
0.1043 95.9762 984 0.0763 0.9701 0.9710 0.9701 0.9695 0.9342 0.7474 0.8304 0.8957 0.9765 0.9343 0.9939 0.9899 0.9919
0.1013 96.9524 994 0.0774 0.9686 0.9697 0.9686 0.9679 0.9333 0.7368 0.8235 0.8893 0.9765 0.9308 0.9939 0.9889 0.9914
0.1097 97.5476 1000 0.0776 0.9686 0.9697 0.9686 0.9679 0.9333 0.7368 0.8235 0.8893 0.9765 0.9308 0.9939 0.9889 0.9914
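The "Overall" columns are consistent with support-weighted averages of the per-class metrics (note that overall recall always equals accuracy under weighted averaging). A minimal sketch with a hypothetical 3-class confusion matrix, not the actual evaluation data:

```python
# Hypothetical confusion matrix (rows = true class T0/T1/T2, cols = predicted),
# used only to illustrate how the overall metrics are derived.
cm = [
    [14, 4, 1],
    [1, 249, 5],
    [0, 11, 984],
]

def per_class_metrics(cm):
    # Precision, recall, and F1 for each class from the confusion matrix.
    n = len(cm)
    out = []
    for k in range(n):
        tp = cm[k][k]
        fp = sum(cm[i][k] for i in range(n)) - tp
        fn = sum(cm[k]) - tp
        p = tp / (tp + fp) if tp + fp else 0.0
        r = tp / (tp + fn) if tp + fn else 0.0
        f = 2 * p * r / (p + r) if p + r else 0.0
        out.append((p, r, f))
    return out

def weighted_average(cm, metrics):
    # Average each metric weighted by class support (row sums).
    support = [sum(row) for row in cm]
    total = sum(support)
    return [sum(s * m[i] for s, m in zip(support, metrics)) / total
            for i in range(3)]

metrics = per_class_metrics(cm)
overall_p, overall_r, overall_f1 = weighted_average(cm, metrics)
accuracy = sum(cm[k][k] for k in range(len(cm))) / sum(map(sum, cm))
```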

Framework versions

  • Transformers 4.46.3
  • Pytorch 2.4.1+cu121
  • Datasets 3.1.0
  • Tokenizers 0.20.3

Model size: 27.6M parameters (Safetensors, F32)

Model tree for Ayushij074/swinv2-tiny-patch4-window8-256-finetuned-eurosat
