Melo1512 committed
Commit 0b3ba33 · verified · 1 Parent(s): f45bfb4

Model save
README.md ADDED
@@ -0,0 +1,171 @@
---
library_name: transformers
license: apache-2.0
base_model: facebook/vit-msn-small
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: vit-msn-small-lateral_flow_ivalidation_train_test_6
  results:
  - task:
      name: Image Classification
      type: image-classification
    dataset:
      name: imagefolder
      type: imagefolder
      config: default
      split: test
      args: default
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.8717948717948718
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# vit-msn-small-lateral_flow_ivalidation_train_test_6

This model is a fine-tuned version of [facebook/vit-msn-small](https://huggingface.co/facebook/vit-msn-small) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4172
- Accuracy: 0.8718
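
The card does not yet document usage, so the following is only a minimal inference sketch. The Hub repository id (`Melo1512/vit-msn-small-lateral_flow_ivalidation_train_test_6`) and the example image path are assumptions, not values recorded in this card.

```python
# Minimal inference sketch (repository id and image path are assumed placeholders).
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "Melo1512/vit-msn-small-lateral_flow_ivalidation_train_test_6"  # assumed repo id
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("example_lateral_flow.jpg").convert("RGB")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(-1).item()
print(model.config.id2label[predicted_id])  # predicted class label
```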

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
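
The data are only identified here as an `imagefolder`-style dataset with a `test` split used for evaluation. A minimal loading sketch is shown below; the directory path and folder layout are assumptions.

```python
# Sketch of loading an imagefolder-style dataset (data_dir is an assumed placeholder).
from datasets import load_dataset

dataset = load_dataset("imagefolder", data_dir="path/to/lateral_flow_images")
print(dataset)  # split names depend on the folder layout, e.g. "train" and "test"
print(dataset["train"][0]["image"], dataset["train"][0]["label"])
```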

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-07
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.3
- num_epochs: 100
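
These values correspond roughly to the `transformers.TrainingArguments` sketch below; the `output_dir` and the evaluation/save strategies are assumptions not recorded in the card.

```python
# Rough TrainingArguments sketch matching the listed hyperparameters
# (output_dir and the eval/save strategies are assumptions).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vit-msn-small-lateral_flow_ivalidation_train_test_6",  # assumed
    learning_rate=5e-7,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    gradient_accumulation_steps=2,   # effective train batch size 64 * 2 = 128
    num_train_epochs=100,
    lr_scheduler_type="linear",
    warmup_ratio=0.3,
    seed=42,
    # Adam betas=(0.9, 0.999) and epsilon=1e-8 are the library defaults, so not set explicitly.
    eval_strategy="epoch",           # assumed; not recorded in the card
    save_strategy="epoch",           # assumed
)
```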

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 0.6672 | 0.9231 | 6 | 0.6980 | 0.4212 |
| 0.6617 | 2.0 | 13 | 0.6965 | 0.4249 |
| 0.6699 | 2.9231 | 19 | 0.6944 | 0.4396 |
| 0.662 | 4.0 | 26 | 0.6910 | 0.4396 |
| 0.6548 | 4.9231 | 32 | 0.6873 | 0.4579 |
| 0.6541 | 6.0 | 39 | 0.6825 | 0.4835 |
| 0.6222 | 6.9231 | 45 | 0.6777 | 0.5311 |
| 0.6555 | 8.0 | 52 | 0.6719 | 0.5421 |
| 0.6226 | 8.9231 | 58 | 0.6665 | 0.5861 |
| 0.5989 | 10.0 | 65 | 0.6603 | 0.6154 |
| 0.5754 | 10.9231 | 71 | 0.6555 | 0.6264 |
| 0.6251 | 12.0 | 78 | 0.6493 | 0.6484 |
| 0.5796 | 12.9231 | 84 | 0.6446 | 0.6667 |
| 0.5763 | 14.0 | 91 | 0.6390 | 0.6667 |
| 0.5952 | 14.9231 | 97 | 0.6333 | 0.6850 |
| 0.5675 | 16.0 | 104 | 0.6269 | 0.7033 |
| 0.5453 | 16.9231 | 110 | 0.6211 | 0.7106 |
| 0.5199 | 18.0 | 117 | 0.6150 | 0.7143 |
| 0.541 | 18.9231 | 123 | 0.6090 | 0.7216 |
| 0.5273 | 20.0 | 130 | 0.6007 | 0.7289 |
| 0.495 | 20.9231 | 136 | 0.5934 | 0.7289 |
| 0.4855 | 22.0 | 143 | 0.5855 | 0.7473 |
| 0.4763 | 22.9231 | 149 | 0.5787 | 0.7363 |
| 0.4287 | 24.0 | 156 | 0.5693 | 0.7509 |
| 0.445 | 24.9231 | 162 | 0.5619 | 0.7692 |
| 0.4343 | 26.0 | 169 | 0.5540 | 0.7802 |
| 0.3748 | 26.9231 | 175 | 0.5467 | 0.7875 |
| 0.4041 | 28.0 | 182 | 0.5421 | 0.8022 |
| 0.3543 | 28.9231 | 188 | 0.5291 | 0.8205 |
| 0.3972 | 30.0 | 195 | 0.5134 | 0.8278 |
| 0.3716 | 30.9231 | 201 | 0.5150 | 0.8242 |
| 0.3871 | 32.0 | 208 | 0.5100 | 0.8315 |
| 0.3729 | 32.9231 | 214 | 0.4986 | 0.8352 |
| 0.3286 | 34.0 | 221 | 0.4946 | 0.8462 |
| 0.4261 | 34.9231 | 227 | 0.4957 | 0.8388 |
| 0.4014 | 36.0 | 234 | 0.4850 | 0.8535 |
| 0.3514 | 36.9231 | 240 | 0.4807 | 0.8535 |
| 0.3883 | 38.0 | 247 | 0.4767 | 0.8535 |
| 0.3219 | 38.9231 | 253 | 0.4763 | 0.8535 |
| 0.4351 | 40.0 | 260 | 0.4738 | 0.8571 |
| 0.3068 | 40.9231 | 266 | 0.4688 | 0.8645 |
| 0.3356 | 42.0 | 273 | 0.4585 | 0.8645 |
| 0.345 | 42.9231 | 279 | 0.4541 | 0.8681 |
| 0.3254 | 44.0 | 286 | 0.4584 | 0.8645 |
| 0.3164 | 44.9231 | 292 | 0.4592 | 0.8571 |
| 0.3657 | 46.0 | 299 | 0.4534 | 0.8608 |
| 0.2655 | 46.9231 | 305 | 0.4502 | 0.8645 |
| 0.2981 | 48.0 | 312 | 0.4452 | 0.8645 |
| 0.3508 | 48.9231 | 318 | 0.4371 | 0.8791 |
| 0.3419 | 50.0 | 325 | 0.4394 | 0.8755 |
| 0.2668 | 50.9231 | 331 | 0.4430 | 0.8755 |
| 0.2972 | 52.0 | 338 | 0.4395 | 0.8718 |
| 0.3514 | 52.9231 | 344 | 0.4371 | 0.8755 |
| 0.3012 | 54.0 | 351 | 0.4330 | 0.8791 |
| 0.2725 | 54.9231 | 357 | 0.4298 | 0.8791 |
| 0.2547 | 56.0 | 364 | 0.4289 | 0.8718 |
| 0.2896 | 56.9231 | 370 | 0.4282 | 0.8718 |
| 0.3469 | 58.0 | 377 | 0.4273 | 0.8718 |
| 0.3528 | 58.9231 | 383 | 0.4269 | 0.8718 |
| 0.2552 | 60.0 | 390 | 0.4324 | 0.8681 |
| 0.239 | 60.9231 | 396 | 0.4319 | 0.8645 |
| 0.3321 | 62.0 | 403 | 0.4270 | 0.8718 |
| 0.3115 | 62.9231 | 409 | 0.4184 | 0.8718 |
| 0.306 | 64.0 | 416 | 0.4169 | 0.8718 |
| 0.3086 | 64.9231 | 422 | 0.4176 | 0.8718 |
| 0.4256 | 66.0 | 429 | 0.4196 | 0.8718 |
| 0.2798 | 66.9231 | 435 | 0.4219 | 0.8718 |
| 0.3016 | 68.0 | 442 | 0.4224 | 0.8718 |
| 0.2791 | 68.9231 | 448 | 0.4207 | 0.8718 |
| 0.2651 | 70.0 | 455 | 0.4189 | 0.8718 |
| 0.2466 | 70.9231 | 461 | 0.4178 | 0.8718 |
| 0.1913 | 72.0 | 468 | 0.4177 | 0.8718 |
| 0.2719 | 72.9231 | 474 | 0.4164 | 0.8718 |
| 0.3364 | 74.0 | 481 | 0.4166 | 0.8718 |
| 0.283 | 74.9231 | 487 | 0.4179 | 0.8755 |
| 0.2891 | 76.0 | 494 | 0.4174 | 0.8755 |
| 0.2625 | 76.9231 | 500 | 0.4180 | 0.8755 |
| 0.2843 | 78.0 | 507 | 0.4184 | 0.8718 |
| 0.375 | 78.9231 | 513 | 0.4167 | 0.8755 |
| 0.3107 | 80.0 | 520 | 0.4150 | 0.8755 |
| 0.3742 | 80.9231 | 526 | 0.4145 | 0.8718 |
| 0.2574 | 82.0 | 533 | 0.4145 | 0.8755 |
| 0.329 | 82.9231 | 539 | 0.4149 | 0.8755 |
| 0.2727 | 84.0 | 546 | 0.4145 | 0.8755 |
| 0.2977 | 84.9231 | 552 | 0.4149 | 0.8755 |
| 0.2611 | 86.0 | 559 | 0.4160 | 0.8718 |
| 0.2542 | 86.9231 | 565 | 0.4170 | 0.8718 |
| 0.2665 | 88.0 | 572 | 0.4171 | 0.8718 |
| 0.2654 | 88.9231 | 578 | 0.4170 | 0.8718 |
| 0.3059 | 90.0 | 585 | 0.4172 | 0.8718 |
| 0.2377 | 90.9231 | 591 | 0.4173 | 0.8718 |
| 0.2896 | 92.0 | 598 | 0.4172 | 0.8718 |
| 0.3133 | 92.3077 | 600 | 0.4172 | 0.8718 |


### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.1+cu121
- Datasets 3.2.0
- Tokenizers 0.19.1
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:d593997d5fcfbd3a8a92ef64aadef7f45afeba184a52a91fffb19f7b77a61166
+oid sha256:6f541da83f2906b268c6a4a6ecc8ec612e021565ebd051be0f064260b90f1f4b
 size 86688624
runs/Jan16_16-38-40_d2e708f84219/events.out.tfevents.1737045527.d2e708f84219.310.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:1dee48a15fc46446ed1ea8b358d4329f86cc29aa58bbf8a606d0266d49acc9a4
-size 160697
+oid sha256:b9470d5307886073e95ac41d53d7b61a3406772322da04d2e31113c96620dfcc
+size 161374