chiragtubakad committed
Commit 26fdbf0 · 1 parent: 6e78600

Model save
README.md ADDED
---
license: apache-2.0
base_model: microsoft/swin-tiny-patch4-window7-224
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: swin-tiny-patch4-window7-224-finetuned-icpr
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# swin-tiny-patch4-window7-224-finetuned-icpr

This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0987
- Accuracy: 0.9818

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100

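The derived values in this list follow from the base settings. As a rough sketch (not part of the original card), the arithmetic below shows how the effective batch size and warmup length relate to the listed hyperparameters, and mirrors the piecewise-linear shape of a linear scheduler with warmup; the total step count of 6500 is taken from the final row of the training results.

```python
# Sketch of how the derived hyperparameters relate to the base ones.
# Values come from the hyperparameter list; total_steps is the final
# "Step" value reported in the training results table.

train_batch_size = 16
gradient_accumulation_steps = 4
warmup_ratio = 0.1
base_lr = 5e-05
total_steps = 6500

# One optimizer step consumes this many samples:
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 64, matching total_train_batch_size above

# Linear warmup over the first warmup_ratio * total_steps optimizer steps:
warmup_steps = int(warmup_ratio * total_steps)
print(warmup_steps)  # 650

def lr_at_step(step: int) -> float:
    """Piecewise-linear LR: ramp up to base_lr, then decay linearly to 0."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * (total_steps - step) / (total_steps - warmup_steps)

print(lr_at_step(650))   # peak learning rate: 5e-05
print(lr_at_step(6500))  # end of training: 0.0
```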
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.1149 | 0.99 | 65 | 0.9590 | 0.7629 |
| 0.2653 | 2.0 | 131 | 0.1648 | 0.9532 |
| 0.1984 | 2.99 | 196 | 0.0894 | 0.9713 |
| 0.1719 | 4.0 | 262 | 0.0863 | 0.9685 |
| 0.1537 | 4.99 | 327 | 0.0810 | 0.9761 |
| 0.1162 | 6.0 | 393 | 0.0785 | 0.9771 |
| 0.1063 | 6.99 | 458 | 0.0835 | 0.9723 |
| 0.1392 | 8.0 | 524 | 0.0674 | 0.9761 |
| 0.1286 | 8.99 | 589 | 0.0788 | 0.9761 |
| 0.1294 | 10.0 | 655 | 0.0658 | 0.9790 |
| 0.0843 | 10.99 | 720 | 0.0735 | 0.9732 |
| 0.074 | 12.0 | 786 | 0.0636 | 0.9761 |
| 0.0734 | 12.99 | 851 | 0.1043 | 0.9751 |
| 0.0774 | 14.0 | 917 | 0.0898 | 0.9723 |
| 0.068 | 14.99 | 982 | 0.0719 | 0.9809 |
| 0.0821 | 16.0 | 1048 | 0.0956 | 0.9742 |
| 0.0576 | 16.99 | 1113 | 0.0725 | 0.9713 |
| 0.0652 | 18.0 | 1179 | 0.0957 | 0.9751 |
| 0.0712 | 18.99 | 1244 | 0.0809 | 0.9790 |
| 0.075 | 20.0 | 1310 | 0.1283 | 0.9675 |
| 0.0988 | 20.99 | 1375 | 0.0966 | 0.9742 |
| 0.0538 | 22.0 | 1441 | 0.1125 | 0.9761 |
| 0.0578 | 22.99 | 1506 | 0.0648 | 0.9828 |
| 0.0675 | 24.0 | 1572 | 0.0992 | 0.9799 |
| 0.0611 | 24.99 | 1637 | 0.0682 | 0.9818 |
| 0.0434 | 26.0 | 1703 | 0.0719 | 0.9809 |
| 0.0339 | 26.99 | 1768 | 0.0930 | 0.9780 |
| 0.0346 | 28.0 | 1834 | 0.0903 | 0.9799 |
| 0.0806 | 28.99 | 1899 | 0.0903 | 0.9799 |
| 0.0518 | 30.0 | 1965 | 0.0982 | 0.9790 |
| 0.0407 | 30.99 | 2030 | 0.0702 | 0.9828 |
| 0.0528 | 32.0 | 2096 | 0.0897 | 0.9761 |
| 0.0774 | 32.99 | 2161 | 0.0626 | 0.9818 |
| 0.053 | 34.0 | 2227 | 0.0576 | 0.9837 |
| 0.0512 | 34.99 | 2292 | 0.0707 | 0.9847 |
| 0.0388 | 36.0 | 2358 | 0.1040 | 0.9790 |
| 0.06 | 36.99 | 2423 | 0.0840 | 0.9799 |
| 0.0477 | 38.0 | 2489 | 0.0659 | 0.9857 |
| 0.0482 | 38.99 | 2554 | 0.0479 | 0.9895 |
| 0.0292 | 40.0 | 2620 | 0.0699 | 0.9818 |
| 0.0386 | 40.99 | 2685 | 0.1030 | 0.9837 |
| 0.0441 | 42.0 | 2751 | 0.0801 | 0.9818 |
| 0.0269 | 42.99 | 2816 | 0.1037 | 0.9809 |
| 0.0385 | 44.0 | 2882 | 0.0870 | 0.9799 |
| 0.0502 | 44.99 | 2947 | 0.1367 | 0.9771 |
| 0.0389 | 46.0 | 3013 | 0.1093 | 0.9771 |
| 0.0209 | 46.99 | 3078 | 0.0954 | 0.9837 |
| 0.0327 | 48.0 | 3144 | 0.0886 | 0.9857 |
| 0.0269 | 48.99 | 3209 | 0.0767 | 0.9828 |
| 0.0461 | 50.0 | 3275 | 0.0661 | 0.9857 |
| 0.0226 | 50.99 | 3340 | 0.0769 | 0.9818 |
| 0.0304 | 52.0 | 3406 | 0.0841 | 0.9828 |
| 0.0326 | 52.99 | 3471 | 0.1002 | 0.9828 |
| 0.0593 | 54.0 | 3537 | 0.0634 | 0.9847 |
| 0.0489 | 54.99 | 3602 | 0.0702 | 0.9837 |
| 0.0495 | 56.0 | 3668 | 0.1060 | 0.9809 |
| 0.0457 | 56.99 | 3733 | 0.0715 | 0.9866 |
| 0.0487 | 58.0 | 3799 | 0.0906 | 0.9818 |
| 0.0416 | 58.99 | 3864 | 0.0973 | 0.9790 |
| 0.0358 | 60.0 | 3930 | 0.0887 | 0.9857 |
| 0.0503 | 60.99 | 3995 | 0.0959 | 0.9809 |
| 0.0555 | 62.0 | 4061 | 0.1057 | 0.9780 |
| 0.0288 | 62.99 | 4126 | 0.0971 | 0.9799 |
| 0.0514 | 64.0 | 4192 | 0.0754 | 0.9847 |
| 0.0602 | 64.99 | 4257 | 0.0789 | 0.9837 |
| 0.0209 | 66.0 | 4323 | 0.1005 | 0.9837 |
| 0.0366 | 66.99 | 4388 | 0.1070 | 0.9818 |
| 0.031 | 68.0 | 4454 | 0.1018 | 0.9818 |
| 0.043 | 68.99 | 4519 | 0.1020 | 0.9828 |
| 0.0262 | 70.0 | 4585 | 0.0896 | 0.9837 |
| 0.0299 | 70.99 | 4650 | 0.0913 | 0.9837 |
| 0.0211 | 72.0 | 4716 | 0.0957 | 0.9857 |
| 0.0351 | 72.99 | 4781 | 0.1180 | 0.9818 |
| 0.0498 | 74.0 | 4847 | 0.1056 | 0.9828 |
| 0.0174 | 74.99 | 4912 | 0.1032 | 0.9809 |
| 0.0368 | 76.0 | 4978 | 0.1071 | 0.9790 |
| 0.0367 | 76.99 | 5043 | 0.0987 | 0.9828 |
| 0.027 | 78.0 | 5109 | 0.1037 | 0.9818 |
| 0.0225 | 78.99 | 5174 | 0.1129 | 0.9809 |
| 0.0241 | 80.0 | 5240 | 0.1202 | 0.9828 |
| 0.026 | 80.99 | 5305 | 0.1219 | 0.9790 |
| 0.0223 | 82.0 | 5371 | 0.1194 | 0.9799 |
| 0.0454 | 82.99 | 5436 | 0.1148 | 0.9790 |
| 0.019 | 84.0 | 5502 | 0.1168 | 0.9818 |
| 0.0269 | 84.99 | 5567 | 0.1246 | 0.9799 |
| 0.0403 | 86.0 | 5633 | 0.1301 | 0.9790 |
| 0.0294 | 86.99 | 5698 | 0.1204 | 0.9799 |
| 0.0501 | 88.0 | 5764 | 0.1168 | 0.9790 |
| 0.0361 | 88.99 | 5829 | 0.1143 | 0.9818 |
| 0.0278 | 90.0 | 5895 | 0.1029 | 0.9799 |
| 0.0267 | 90.99 | 5960 | 0.0991 | 0.9818 |
| 0.0308 | 92.0 | 6026 | 0.1028 | 0.9828 |
| 0.0246 | 92.99 | 6091 | 0.1031 | 0.9809 |
| 0.0283 | 94.0 | 6157 | 0.1035 | 0.9818 |
| 0.0278 | 94.99 | 6222 | 0.0999 | 0.9818 |
| 0.0221 | 96.0 | 6288 | 0.1007 | 0.9809 |
| 0.0197 | 96.99 | 6353 | 0.0989 | 0.9818 |
| 0.0435 | 98.0 | 6419 | 0.0986 | 0.9818 |
| 0.0266 | 98.99 | 6484 | 0.0987 | 0.9818 |
| 0.0334 | 99.24 | 6500 | 0.0987 | 0.9818 |

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu118
- Datasets 2.14.7
- Tokenizers 0.15.0
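To reproduce this environment, the versions above could be pinned in a `requirements.txt` along these lines (a sketch; note that the `+cu118` PyTorch build ships from the CUDA 11.8 wheel index rather than the default PyPI index):

```text
transformers==4.35.2
torch==2.1.0
datasets==2.14.7
tokenizers==0.15.0
```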
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:12cfa02e05570696677e4983bb433cb28d42165af82b2e132b94d3ddd9e66269
+oid sha256:30158a2640e1a6394042b054cda6216bb75f25387566271d0ecde1418fe1ed8b
 size 110348984
runs/Nov16_07-23-29_ec676460ea77/events.out.tfevents.1700119417.ec676460ea77.5323.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:4a46039f2c24cb276aabe5562f3af0ca2d98e92b825dd0c2bf7051d214df189f
-size 138390
+oid sha256:c76e97e20772460975fe3d95f0bd67149c456a6b4f5eff2709d1a5e50e2f3894
+size 139381