---
library_name: transformers
license: apache-2.0
base_model: Melo1512/vit-msn-small-lateral_flow_ivalidation_train_test
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: vit-msn-small-lateral_flow_ivalidation_train_test
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: test
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.9120879120879121
---

# vit-msn-small-lateral_flow_ivalidation_train_test

This model is a fine-tuned version of [Melo1512/vit-msn-small-lateral_flow_ivalidation_train_test](https://huggingface.co/Melo1512/vit-msn-small-lateral_flow_ivalidation_train_test) on the imagefolder dataset. It achieves the following results on the evaluation set:

- Loss: 0.3718
- Accuracy: 0.9121
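
The card does not yet document usage, so here is a minimal inference sketch using the `transformers` image-classification pipeline. Only the model id comes from this card; the image path is a placeholder.

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hub.
classifier = pipeline(
    "image-classification",
    model="Melo1512/vit-msn-small-lateral_flow_ivalidation_train_test",
)

# "strip.jpg" is a placeholder; point it at your own lateral-flow image.
for prediction in classifier("strip.jpg"):
    print(f"{prediction['label']}: {prediction['score']:.4f}")
```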

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch mirroring them follows the list):

- learning_rate: 5e-06
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 100
- label_smoothing_factor: 0.1
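
For reference, a sketch of how these values map onto `transformers.TrainingArguments`. The output directory and anything not listed above are assumptions, not settings confirmed by this card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vit-msn-small-lateral_flow_ivalidation_train_test",  # assumed
    learning_rate=5e-6,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    gradient_accumulation_steps=4,  # 64 * 4 = 256 total train batch size
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=100,
    label_smoothing_factor=0.1,
)
```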

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 0.4023 | 0.9231 | 3 | 0.4126 | 0.8938 |
| 0.4055 | 1.8462 | 6 | 0.4096 | 0.8974 |
| 0.4112 | 2.7692 | 9 | 0.4275 | 0.8974 |
| 0.4246 | 4.0 | 13 | 0.3958 | 0.9048 |
| 0.376 | 4.9231 | 16 | 0.3980 | 0.9121 |
| 0.4226 | 5.8462 | 19 | 0.4029 | 0.9084 |
| 0.3751 | 6.7692 | 22 | 0.3749 | 0.9048 |
| 0.4135 | 8.0 | 26 | 0.3757 | 0.9121 |
| 0.3673 | 8.9231 | 29 | 0.4174 | 0.8901 |
| 0.3749 | 9.8462 | 32 | 0.4077 | 0.9048 |
| 0.4119 | 10.7692 | 35 | 0.4181 | 0.8901 |
| 0.3946 | 12.0 | 39 | 0.4189 | 0.8901 |
| 0.3335 | 12.9231 | 42 | 0.4029 | 0.9084 |
| 0.3717 | 13.8462 | 45 | 0.3963 | 0.9011 |
| 0.3493 | 14.7692 | 48 | 0.3797 | 0.9158 |
| 0.3686 | 16.0 | 52 | 0.3761 | 0.9121 |
| 0.3999 | 16.9231 | 55 | 0.3774 | 0.9158 |
| 0.3221 | 17.8462 | 58 | 0.3757 | 0.9158 |
| 0.3902 | 18.7692 | 61 | 0.3774 | 0.9121 |
| 0.3649 | 20.0 | 65 | 0.3962 | 0.9011 |
| 0.3553 | 20.9231 | 68 | 0.3718 | 0.9121 |
| 0.3761 | 21.8462 | 71 | 0.3934 | 0.9121 |
| 0.3422 | 22.7692 | 74 | 0.4271 | 0.8828 |
| 0.3247 | 24.0 | 78 | 0.3727 | 0.9194 |
| 0.3417 | 24.9231 | 81 | 0.3793 | 0.9121 |
| 0.3499 | 25.8462 | 84 | 0.4293 | 0.8791 |
| 0.3397 | 26.7692 | 87 | 0.4216 | 0.8901 |
| 0.346 | 28.0 | 91 | 0.4001 | 0.8901 |
| 0.3337 | 28.9231 | 94 | 0.4168 | 0.8864 |
| 0.3268 | 29.8462 | 97 | 0.4123 | 0.8938 |
| 0.3274 | 30.7692 | 100 | 0.4187 | 0.8828 |
| 0.3757 | 32.0 | 104 | 0.4026 | 0.8974 |
| 0.3727 | 32.9231 | 107 | 0.4021 | 0.8938 |
| 0.3431 | 33.8462 | 110 | 0.4024 | 0.9011 |
| 0.3626 | 34.7692 | 113 | 0.4200 | 0.8901 |
| 0.3381 | 36.0 | 117 | 0.4080 | 0.8938 |
| 0.3411 | 36.9231 | 120 | 0.4279 | 0.8791 |
| 0.3229 | 37.8462 | 123 | 0.4422 | 0.8718 |
| 0.3736 | 38.7692 | 126 | 0.4285 | 0.8791 |
| 0.4145 | 40.0 | 130 | 0.4402 | 0.8718 |
| 0.3456 | 40.9231 | 133 | 0.4226 | 0.8828 |
| 0.3567 | 41.8462 | 136 | 0.4113 | 0.8901 |
| 0.339 | 42.7692 | 139 | 0.4445 | 0.8645 |
| 0.3142 | 44.0 | 143 | 0.4204 | 0.8791 |
| 0.3461 | 44.9231 | 146 | 0.4006 | 0.8974 |
| 0.3583 | 45.8462 | 149 | 0.3991 | 0.9011 |
| 0.3651 | 46.7692 | 152 | 0.4293 | 0.8681 |
| 0.3098 | 48.0 | 156 | 0.4082 | 0.8901 |
| 0.375 | 48.9231 | 159 | 0.4095 | 0.8864 |
| 0.3435 | 49.8462 | 162 | 0.4529 | 0.8498 |
| 0.3452 | 50.7692 | 165 | 0.4440 | 0.8608 |
| 0.3316 | 52.0 | 169 | 0.4181 | 0.8791 |
| 0.3344 | 52.9231 | 172 | 0.4609 | 0.8535 |
| 0.3377 | 53.8462 | 175 | 0.4775 | 0.8278 |
| 0.3455 | 54.7692 | 178 | 0.4396 | 0.8681 |
| 0.3202 | 56.0 | 182 | 0.4384 | 0.8755 |
| 0.3119 | 56.9231 | 185 | 0.4573 | 0.8535 |
| 0.3633 | 57.8462 | 188 | 0.4469 | 0.8645 |
| 0.3025 | 58.7692 | 191 | 0.4437 | 0.8608 |
| 0.3094 | 60.0 | 195 | 0.4472 | 0.8571 |
| 0.3306 | 60.9231 | 198 | 0.4396 | 0.8681 |
| 0.3266 | 61.8462 | 201 | 0.4486 | 0.8681 |
| 0.3495 | 62.7692 | 204 | 0.4658 | 0.8352 |
| 0.3066 | 64.0 | 208 | 0.4754 | 0.8315 |
| 0.3384 | 64.9231 | 211 | 0.4518 | 0.8608 |
| 0.3151 | 65.8462 | 214 | 0.4614 | 0.8535 |
| 0.3233 | 66.7692 | 217 | 0.4638 | 0.8425 |
| 0.3416 | 68.0 | 221 | 0.4741 | 0.8315 |
| 0.3326 | 68.9231 | 224 | 0.4679 | 0.8425 |
| 0.331 | 69.8462 | 227 | 0.4754 | 0.8315 |
| 0.3595 | 70.7692 | 230 | 0.4603 | 0.8498 |
| 0.3107 | 72.0 | 234 | 0.4412 | 0.8571 |
| 0.3126 | 72.9231 | 237 | 0.4578 | 0.8571 |
| 0.3205 | 73.8462 | 240 | 0.4820 | 0.8242 |
| 0.3296 | 74.7692 | 243 | 0.5048 | 0.7985 |
| 0.3246 | 76.0 | 247 | 0.4792 | 0.8278 |
| 0.3065 | 76.9231 | 250 | 0.4842 | 0.8242 |
| 0.282 | 77.8462 | 253 | 0.5049 | 0.7912 |
| 0.3272 | 78.7692 | 256 | 0.5088 | 0.7875 |
| 0.325 | 80.0 | 260 | 0.4933 | 0.8132 |
| 0.3524 | 80.9231 | 263 | 0.4893 | 0.8132 |
| 0.3019 | 81.8462 | 266 | 0.4864 | 0.8132 |
| 0.3095 | 82.7692 | 269 | 0.4875 | 0.8132 |
| 0.3254 | 84.0 | 273 | 0.4910 | 0.8059 |
| 0.3158 | 84.9231 | 276 | 0.4918 | 0.8059 |
| 0.3114 | 85.8462 | 279 | 0.4936 | 0.8059 |
| 0.3348 | 86.7692 | 282 | 0.4996 | 0.7985 |
| 0.3078 | 88.0 | 286 | 0.5043 | 0.7949 |
| 0.3096 | 88.9231 | 289 | 0.5047 | 0.7949 |
| 0.2827 | 89.8462 | 292 | 0.5054 | 0.7949 |
| 0.3249 | 90.7692 | 295 | 0.5040 | 0.7949 |
| 0.3277 | 92.0 | 299 | 0.5031 | 0.7985 |
| 0.3522 | 92.3077 | 300 | 0.5030 | 0.7985 |
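
Validation loss reaches its minimum of 0.3718 around epoch 21 and drifts upward afterward, which matches the evaluation result reported at the top of this card. If the `trainer_state.json` that Trainer writes alongside its checkpoints is available, the curve can be inspected with a short script like the following (the file path is an assumption):

```python
import json

import matplotlib.pyplot as plt

# Path is an assumption; Trainer saves this file next to its checkpoints.
with open("trainer_state.json") as f:
    state = json.load(f)

# Keep only the evaluation entries from the log history.
evals = [entry for entry in state["log_history"] if "eval_loss" in entry]

plt.plot([e["epoch"] for e in evals], [e["eval_loss"] for e in evals])
plt.xlabel("Epoch")
plt.ylabel("Validation loss")
plt.title("Validation loss over training")
plt.show()
```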

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.1+cu121
- Datasets 3.2.0
- Tokenizers 0.19.1