Melo1512 committed · Commit 59783b8 (verified) · 1 Parent(s): 612b947

Model save

README.md ADDED
@@ -0,0 +1,171 @@
---
library_name: transformers
license: apache-2.0
base_model: facebook/vit-msn-small
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: vit-msn-small-corect_cleaned_dataset_lateral_flow_ivalidation
  results:
  - task:
      name: Image Classification
      type: image-classification
    dataset:
      name: imagefolder
      type: imagefolder
      config: default
      split: validation
      args: default
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.7764932562620424
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# vit-msn-small-corect_cleaned_dataset_lateral_flow_ivalidation

This model is a fine-tuned version of [facebook/vit-msn-small](https://huggingface.co/facebook/vit-msn-small) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6359
- Accuracy: 0.7765

+ ## Model description
40
+
41
+ More information needed
42
+
43
+ ## Intended uses & limitations
44
+
45
+ More information needed
46
+
47
+ ## Training and evaluation data
48
+
49
+ More information needed
50
+
51
+ ## Training procedure
52
+
53
+ ### Training hyperparameters
54
+
55
+ The following hyperparameters were used during training:
56
+ - learning_rate: 5e-05
57
+ - train_batch_size: 64
58
+ - eval_batch_size: 64
59
+ - seed: 42
60
+ - gradient_accumulation_steps: 4
61
+ - total_train_batch_size: 256
62
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
63
+ - lr_scheduler_type: linear
64
+ - lr_scheduler_warmup_ratio: 0.1
65
+ - num_epochs: 100
66
+
67
+ ### Training results
68
+
69
+ | Training Loss | Epoch | Step | Validation Loss | Accuracy |
70
+ |:-------------:|:-------:|:----:|:---------------:|:--------:|
71
+ | No log | 0.9231 | 3 | 0.8617 | 0.1493 |
72
+ | No log | 1.8462 | 6 | 0.4977 | 0.8796 |
73
+ | No log | 2.7692 | 9 | 0.6143 | 0.7274 |
74
+ | 0.6181 | 4.0 | 13 | 0.4685 | 0.8324 |
75
+ | 0.6181 | 4.9231 | 16 | 0.3828 | 0.8622 |
76
+ | 0.6181 | 5.8462 | 19 | 0.4028 | 0.8497 |
77
+ | 0.3645 | 6.7692 | 22 | 0.2485 | 0.9210 |
78
+ | 0.3645 | 8.0 | 26 | 0.2426 | 0.9123 |
79
+ | 0.3645 | 8.9231 | 29 | 0.5674 | 0.7775 |
80
+ | 0.3492 | 9.8462 | 32 | 0.1610 | 0.9489 |
81
+ | 0.3492 | 10.7692 | 35 | 0.3225 | 0.9171 |
82
+ | 0.3492 | 12.0 | 39 | 0.2904 | 0.9123 |
83
+ | 0.3472 | 12.9231 | 42 | 0.2851 | 0.9133 |
84
+ | 0.3472 | 13.8462 | 45 | 0.3330 | 0.8931 |
85
+ | 0.3472 | 14.7692 | 48 | 0.6116 | 0.7389 |
86
+ | 0.2838 | 16.0 | 52 | 0.2677 | 0.9046 |
87
+ | 0.2838 | 16.9231 | 55 | 0.3216 | 0.8825 |
88
+ | 0.2838 | 17.8462 | 58 | 0.2165 | 0.9258 |
89
+ | 0.292 | 18.7692 | 61 | 0.4377 | 0.8333 |
90
+ | 0.292 | 20.0 | 65 | 0.3592 | 0.8699 |
91
+ | 0.292 | 20.9231 | 68 | 0.5011 | 0.7987 |
92
+ | 0.2809 | 21.8462 | 71 | 0.2319 | 0.9162 |
93
+ | 0.2809 | 22.7692 | 74 | 0.4018 | 0.8449 |
94
+ | 0.2809 | 24.0 | 78 | 0.4851 | 0.7996 |
95
+ | 0.251 | 24.9231 | 81 | 0.4668 | 0.8276 |
96
+ | 0.251 | 25.8462 | 84 | 0.4974 | 0.8179 |
97
+ | 0.251 | 26.7692 | 87 | 0.5482 | 0.7890 |
98
+ | 0.2371 | 28.0 | 91 | 0.6840 | 0.7370 |
99
+ | 0.2371 | 28.9231 | 94 | 0.3629 | 0.8613 |
100
+ | 0.2371 | 29.8462 | 97 | 0.6212 | 0.7331 |
101
+ | 0.2416 | 30.7692 | 100 | 0.3657 | 0.8642 |
102
+ | 0.2416 | 32.0 | 104 | 0.5857 | 0.7649 |
103
+ | 0.2416 | 32.9231 | 107 | 0.3610 | 0.8565 |
104
+ | 0.2312 | 33.8462 | 110 | 0.8753 | 0.6358 |
105
+ | 0.2312 | 34.7692 | 113 | 0.4993 | 0.7977 |
106
+ | 0.2312 | 36.0 | 117 | 0.4702 | 0.8131 |
107
+ | 0.2131 | 36.9231 | 120 | 0.3648 | 0.8584 |
108
+ | 0.2131 | 37.8462 | 123 | 0.7660 | 0.7062 |
109
+ | 0.2131 | 38.7692 | 126 | 0.4444 | 0.8304 |
110
+ | 0.2248 | 40.0 | 130 | 0.7568 | 0.7206 |
111
+ | 0.2248 | 40.9231 | 133 | 0.6134 | 0.7746 |
112
+ | 0.2248 | 41.8462 | 136 | 0.3969 | 0.8372 |
113
+ | 0.2248 | 42.7692 | 139 | 0.6100 | 0.7428 |
114
+ | 0.2341 | 44.0 | 143 | 0.6376 | 0.7486 |
115
+ | 0.2341 | 44.9231 | 146 | 0.8082 | 0.6965 |
116
+ | 0.2341 | 45.8462 | 149 | 0.5552 | 0.7987 |
117
+ | 0.1998 | 46.7692 | 152 | 0.5736 | 0.7784 |
118
+ | 0.1998 | 48.0 | 156 | 0.4477 | 0.8179 |
119
+ | 0.1998 | 48.9231 | 159 | 0.4925 | 0.8064 |
120
+ | 0.2075 | 49.8462 | 162 | 0.6641 | 0.7408 |
121
+ | 0.2075 | 50.7692 | 165 | 0.6718 | 0.7418 |
122
+ | 0.2075 | 52.0 | 169 | 0.4913 | 0.8170 |
123
+ | 0.197 | 52.9231 | 172 | 0.5316 | 0.7967 |
124
+ | 0.197 | 53.8462 | 175 | 0.7917 | 0.7033 |
125
+ | 0.197 | 54.7692 | 178 | 0.8232 | 0.6850 |
126
+ | 0.1769 | 56.0 | 182 | 0.8841 | 0.6753 |
127
+ | 0.1769 | 56.9231 | 185 | 0.7670 | 0.7206 |
128
+ | 0.1769 | 57.8462 | 188 | 0.7893 | 0.7168 |
129
+ | 0.1735 | 58.7692 | 191 | 1.1965 | 0.6002 |
130
+ | 0.1735 | 60.0 | 195 | 1.0561 | 0.6570 |
131
+ | 0.1735 | 60.9231 | 198 | 0.7164 | 0.7408 |
132
+ | 0.1905 | 61.8462 | 201 | 0.6160 | 0.7611 |
133
+ | 0.1905 | 62.7692 | 204 | 0.4964 | 0.8006 |
134
+ | 0.1905 | 64.0 | 208 | 0.6949 | 0.7370 |
135
+ | 0.1748 | 64.9231 | 211 | 0.5145 | 0.8044 |
136
+ | 0.1748 | 65.8462 | 214 | 0.6397 | 0.7707 |
137
+ | 0.1748 | 66.7692 | 217 | 0.5984 | 0.7900 |
138
+ | 0.1535 | 68.0 | 221 | 0.4233 | 0.8459 |
139
+ | 0.1535 | 68.9231 | 224 | 0.4464 | 0.8343 |
140
+ | 0.1535 | 69.8462 | 227 | 0.3953 | 0.8497 |
141
+ | 0.1633 | 70.7692 | 230 | 0.4314 | 0.8314 |
142
+ | 0.1633 | 72.0 | 234 | 0.5035 | 0.8025 |
143
+ | 0.1633 | 72.9231 | 237 | 0.5387 | 0.7803 |
144
+ | 0.145 | 73.8462 | 240 | 0.5016 | 0.8025 |
145
+ | 0.145 | 74.7692 | 243 | 0.4606 | 0.8160 |
146
+ | 0.145 | 76.0 | 247 | 0.6732 | 0.7524 |
147
+ | 0.1584 | 76.9231 | 250 | 0.6854 | 0.7524 |
148
+ | 0.1584 | 77.8462 | 253 | 0.6868 | 0.7572 |
149
+ | 0.1584 | 78.7692 | 256 | 0.6765 | 0.7582 |
150
+ | 0.1423 | 80.0 | 260 | 0.6295 | 0.7832 |
151
+ | 0.1423 | 80.9231 | 263 | 0.6124 | 0.7909 |
152
+ | 0.1423 | 81.8462 | 266 | 0.6027 | 0.7881 |
153
+ | 0.1423 | 82.7692 | 269 | 0.6008 | 0.7861 |
154
+ | 0.1449 | 84.0 | 273 | 0.6533 | 0.7688 |
155
+ | 0.1449 | 84.9231 | 276 | 0.6304 | 0.7697 |
156
+ | 0.1449 | 85.8462 | 279 | 0.5607 | 0.7996 |
157
+ | 0.1452 | 86.7692 | 282 | 0.5739 | 0.7929 |
158
+ | 0.1452 | 88.0 | 286 | 0.6115 | 0.7765 |
159
+ | 0.1452 | 88.9231 | 289 | 0.6277 | 0.7726 |
160
+ | 0.1232 | 89.8462 | 292 | 0.6273 | 0.7784 |
161
+ | 0.1232 | 90.7692 | 295 | 0.6300 | 0.7775 |
162
+ | 0.1232 | 92.0 | 299 | 0.6361 | 0.7765 |
163
+ | 0.1494 | 92.3077 | 300 | 0.6359 | 0.7765 |
164
+
165
+
166
+ ### Framework versions
167
+
168
+ - Transformers 4.44.2
169
+ - Pytorch 2.4.1+cu121
170
+ - Datasets 3.2.0
171
+ - Tokenizers 0.19.1
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:110ffc77c9f2fc4bb5983f44e05702617ec0ab7034b51f67667070fc66582281
+ oid sha256:d0038b3fb518d4e9259091275cd792c64c96325147442f083e52d7fabf57b63b
  size 86688624
runs/Jan14_15-44-12_c583982b4f3d/events.out.tfevents.1736869459.c583982b4f3d.215.2 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:b4f7ea5ad8ac29499c7d4395620727670e001c48a58d149d7f4480d6d3d7bd39
- size 40587
+ oid sha256:8ae9d51432db4c90d055a1915857e7292d81307fbb9d56e2302551fdd327a3e4
+ size 41475