bobbyw committed on
Commit c607ec4 · verified · 1 Parent(s): 55871ef

End of training
README.md CHANGED
@@ -20,11 +20,12 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [microsoft/deberta-v3-small](https://huggingface.co/microsoft/deberta-v3-small) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.0374
-- Accuracy: 0.0069
-- F1: 0.0137
-- Precision: 0.0069
-- Recall: 1.0
 
 ## Model description
 
@@ -43,43 +44,123 @@ More information needed
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
-- learning_rate: 0.002
-- train_batch_size: 16
-- eval_batch_size: 16
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
-- num_epochs: 20
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
-|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|
-| No log | 1.0 | 39 | 0.0447 | 0.0069 | 0.0137 | 0.0069 | 1.0 |
-| No log | 2.0 | 78 | 0.0401 | 0.0069 | 0.0137 | 0.0069 | 1.0 |
-| No log | 3.0 | 117 | 0.0401 | 0.0069 | 0.0137 | 0.0069 | 1.0 |
-| No log | 4.0 | 156 | 0.0395 | 0.0069 | 0.0137 | 0.0069 | 1.0 |
-| No log | 5.0 | 195 | 0.0379 | 0.0069 | 0.0137 | 0.0069 | 1.0 |
-| No log | 6.0 | 234 | 0.0374 | 0.0069 | 0.0137 | 0.0069 | 1.0 |
-| No log | 7.0 | 273 | 0.0383 | 0.0069 | 0.0137 | 0.0069 | 1.0 |
-| No log | 8.0 | 312 | 0.0378 | 0.0069 | 0.0137 | 0.0069 | 1.0 |
-| No log | 9.0 | 351 | 0.0384 | 0.0069 | 0.0137 | 0.0069 | 1.0 |
-| No log | 10.0 | 390 | 0.0371 | 0.0069 | 0.0137 | 0.0069 | 1.0 |
-| No log | 11.0 | 429 | 0.0378 | 0.0069 | 0.0137 | 0.0069 | 1.0 |
-| No log | 12.0 | 468 | 0.0371 | 0.0069 | 0.0137 | 0.0069 | 1.0 |
-| 0.037 | 13.0 | 507 | 0.0373 | 0.0069 | 0.0137 | 0.0069 | 1.0 |
-| 0.037 | 14.0 | 546 | 0.0380 | 0.0069 | 0.0137 | 0.0069 | 1.0 |
-| 0.037 | 15.0 | 585 | 0.0390 | 0.0069 | 0.0137 | 0.0069 | 1.0 |
-| 0.037 | 16.0 | 624 | 0.0373 | 0.0069 | 0.0137 | 0.0069 | 1.0 |
-| 0.037 | 17.0 | 663 | 0.0376 | 0.0069 | 0.0137 | 0.0069 | 1.0 |
-| 0.037 | 18.0 | 702 | 0.0371 | 0.0069 | 0.0137 | 0.0069 | 1.0 |
-| 0.037 | 19.0 | 741 | 0.0375 | 0.0069 | 0.0137 | 0.0069 | 1.0 |
-| 0.037 | 20.0 | 780 | 0.0374 | 0.0069 | 0.0137 | 0.0069 | 1.0 |
 
 
 ### Framework versions
 
-- Transformers 4.40.0
 - Pytorch 2.2.1+cu121
 - Datasets 2.19.0
 - Tokenizers 0.19.1
 
 
 This model is a fine-tuned version of [microsoft/deberta-v3-small](https://huggingface.co/microsoft/deberta-v3-small) on an unknown dataset.
 It achieves the following results on the evaluation set:
+- Loss: 0.0134
+- Accuracy: 0.0038
+- F1: 0.0062
+- Precision: 0.0031
+- Recall: 0.625
+- Learning Rate: 0.0
 
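The reported F1 is consistent with the precision and recall above, since F1 is the harmonic mean of the two. A minimal check in Python (the function name is illustrative, not taken from the training code):

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Values from the evaluation summary: precision 0.0031, recall 0.625
print(round(f1_score(0.0031, 0.625), 4))  # 0.0062, matching the reported F1
```

The earlier run's metrics check out the same way: precision 0.0069 with recall 1.0 gives the reported F1 of 0.0137.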
 ## Model description
 
 
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
+- learning_rate: 2e-05
+- train_batch_size: 32
+- eval_batch_size: 32
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
+- num_epochs: 100
 
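With `lr_scheduler_type: linear`, the learning rate decays linearly from the base `learning_rate` of 2e-05 to 0 over the 2000 total steps, which matches the Rate column in the results table (1e-05 at step 1000, 5e-06 at step 1500, 0.0 at step 2000). A sketch of that schedule, assuming no warmup (the card does not list warmup steps):

```python
def linear_lr(step: int, base_lr: float = 2e-05,
              total_steps: int = 2000, warmup_steps: int = 0) -> float:
    """Linear-decay schedule: ramp up over warmup_steps (assumed 0 here),
    then decay linearly to zero at total_steps."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(linear_lr(1000))  # 1e-05, as logged at epoch 50
print(linear_lr(2000))  # 0.0, as logged at the final step
```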
 ### Training results
 
+| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall | Rate |
+|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|:------:|
+| No log | 1.0 | 20 | 0.5469 | 0.0994 | 0.0093 | 0.0047 | 0.8438 | 0.0000 |
+| No log | 2.0 | 40 | 0.3859 | 0.0050 | 0.0100 | 0.0050 | 1.0 | 0.0000 |
+| No log | 3.0 | 60 | 0.2662 | 0.0050 | 0.0100 | 0.0050 | 1.0 | 0.0000 |
+| No log | 4.0 | 80 | 0.1781 | 0.0050 | 0.0100 | 0.0050 | 1.0 | 0.0000 |
+| No log | 5.0 | 100 | 0.1183 | 0.0050 | 0.0100 | 0.0050 | 1.0 | 0.0000 |
+| No log | 6.0 | 120 | 0.0823 | 0.0050 | 0.0100 | 0.0050 | 1.0 | 0.0000 |
+| No log | 7.0 | 140 | 0.0614 | 0.0050 | 0.0100 | 0.0050 | 1.0 | 0.0000 |
+| No log | 8.0 | 160 | 0.0494 | 0.0050 | 0.0100 | 0.0050 | 1.0 | 0.0000 |
+| No log | 9.0 | 180 | 0.0423 | 0.0050 | 0.0100 | 0.0050 | 1.0 | 0.0000 |
+| No log | 10.0 | 200 | 0.0379 | 0.0050 | 0.0100 | 0.0050 | 1.0 | 0.0000 |
+| No log | 11.0 | 220 | 0.0350 | 0.0050 | 0.0100 | 0.0050 | 1.0 | 0.0000 |
+| No log | 12.0 | 240 | 0.0331 | 0.0050 | 0.0100 | 0.0050 | 1.0 | 0.0000 |
+| No log | 13.0 | 260 | 0.0318 | 0.0050 | 0.0100 | 0.0050 | 1.0 | 0.0000 |
+| No log | 14.0 | 280 | 0.0307 | 0.0050 | 0.0100 | 0.0050 | 1.0 | 0.0000 |
+| No log | 15.0 | 300 | 0.0300 | 0.0050 | 0.0100 | 0.0050 | 1.0 | 0.0000 |
+| No log | 16.0 | 320 | 0.0294 | 0.0050 | 0.0100 | 0.0050 | 1.0 | 0.0000 |
+| No log | 17.0 | 340 | 0.0290 | 0.0050 | 0.0100 | 0.0050 | 1.0 | 0.0000 |
+| No log | 18.0 | 360 | 0.0286 | 0.0050 | 0.0100 | 0.0050 | 1.0 | 0.0000 |
+| No log | 19.0 | 380 | 0.0283 | 0.0050 | 0.0100 | 0.0050 | 1.0 | 0.0000 |
+| No log | 20.0 | 400 | 0.0300 | 0.0050 | 0.0100 | 0.0050 | 1.0 | 0.0000 |
+| No log | 21.0 | 420 | 0.0290 | 0.0050 | 0.0100 | 0.0050 | 1.0 | 0.0000 |
+| No log | 22.0 | 440 | 0.0252 | 0.0050 | 0.0100 | 0.0050 | 1.0 | 0.0000 |
+| No log | 23.0 | 460 | 0.0246 | 0.0050 | 0.0100 | 0.0050 | 1.0 | 0.0000 |
+| No log | 24.0 | 480 | 0.0242 | 0.0050 | 0.0100 | 0.0050 | 1.0 | 0.0000 |
+| 0.1127 | 25.0 | 500 | 0.0239 | 0.0050 | 0.0100 | 0.0050 | 1.0 | 0.0000 |
+| 0.1127 | 26.0 | 520 | 0.0233 | 0.0050 | 0.0100 | 0.0050 | 1.0 | 0.0000 |
+| 0.1127 | 27.0 | 540 | 0.0226 | 0.0050 | 0.0100 | 0.0050 | 1.0 | 0.0000 |
+| 0.1127 | 28.0 | 560 | 0.0224 | 0.0050 | 0.0100 | 0.0050 | 1.0 | 0.0000 |
+| 0.1127 | 29.0 | 580 | 0.0217 | 0.0050 | 0.0100 | 0.0050 | 1.0 | 0.0000 |
+| 0.1127 | 30.0 | 600 | 0.0211 | 0.0047 | 0.0093 | 0.0047 | 0.9375 | 0.0000 |
+| 0.1127 | 31.0 | 620 | 0.0206 | 0.0045 | 0.0090 | 0.0045 | 0.9062 | 0.0000 |
+| 0.1127 | 32.0 | 640 | 0.0207 | 0.0047 | 0.0090 | 0.0045 | 0.9062 | 0.0000 |
+| 0.1127 | 33.0 | 660 | 0.0198 | 0.0045 | 0.0090 | 0.0045 | 0.9062 | 0.0000 |
+| 0.1127 | 34.0 | 680 | 0.0205 | 0.0047 | 0.0090 | 0.0045 | 0.9062 | 0.0000 |
+| 0.1127 | 35.0 | 700 | 0.0193 | 0.0045 | 0.0090 | 0.0045 | 0.9062 | 0.0000 |
+| 0.1127 | 36.0 | 720 | 0.0198 | 0.0045 | 0.0090 | 0.0045 | 0.9062 | 0.0000 |
+| 0.1127 | 37.0 | 740 | 0.0190 | 0.0047 | 0.0090 | 0.0045 | 0.9062 | 0.0000 |
+| 0.1127 | 38.0 | 760 | 0.0197 | 0.0049 | 0.0090 | 0.0045 | 0.9062 | 0.0000 |
+| 0.1127 | 39.0 | 780 | 0.0185 | 0.0047 | 0.0090 | 0.0045 | 0.9062 | 0.0000 |
+| 0.1127 | 40.0 | 800 | 0.0184 | 0.0045 | 0.0090 | 0.0045 | 0.9062 | 0.0000 |
+| 0.1127 | 41.0 | 820 | 0.0188 | 0.0045 | 0.0090 | 0.0045 | 0.9062 | 0.0000 |
+| 0.1127 | 42.0 | 840 | 0.0179 | 0.0045 | 0.0090 | 0.0045 | 0.9062 | 0.0000 |
+| 0.1127 | 43.0 | 860 | 0.0178 | 0.0045 | 0.0090 | 0.0045 | 0.9062 | 0.0000 |
+| 0.1127 | 44.0 | 880 | 0.0174 | 0.0045 | 0.0090 | 0.0045 | 0.9062 | 0.0000 |
+| 0.1127 | 45.0 | 900 | 0.0182 | 0.0041 | 0.0081 | 0.0041 | 0.8125 | 0.0000 |
+| 0.1127 | 46.0 | 920 | 0.0171 | 0.0045 | 0.0090 | 0.0045 | 0.9062 | 0.0000 |
+| 0.1127 | 47.0 | 940 | 0.0168 | 0.0044 | 0.0087 | 0.0044 | 0.875 | 0.0000 |
+| 0.1127 | 48.0 | 960 | 0.0167 | 0.0041 | 0.0081 | 0.0041 | 0.8125 | 0.0000 |
+| 0.1127 | 49.0 | 980 | 0.0165 | 0.0039 | 0.0078 | 0.0039 | 0.7812 | 0.0000 |
+| 0.0253 | 50.0 | 1000 | 0.0162 | 0.0039 | 0.0078 | 0.0039 | 0.7812 | 1e-05 |
+| 0.0253 | 51.0 | 1020 | 0.0160 | 0.0041 | 0.0081 | 0.0041 | 0.8125 | 0.0000 |
+| 0.0253 | 52.0 | 1040 | 0.0159 | 0.0038 | 0.0075 | 0.0038 | 0.75 | 0.0000 |
+| 0.0253 | 53.0 | 1060 | 0.0158 | 0.0038 | 0.0075 | 0.0038 | 0.75 | 0.0000 |
+| 0.0253 | 54.0 | 1080 | 0.0163 | 0.0041 | 0.0075 | 0.0038 | 0.75 | 0.0000 |
+| 0.0253 | 55.0 | 1100 | 0.0160 | 0.0039 | 0.0072 | 0.0036 | 0.7188 | 9e-06 |
+| 0.0253 | 56.0 | 1120 | 0.0161 | 0.0034 | 0.0069 | 0.0034 | 0.6875 | 0.0000 |
+| 0.0253 | 57.0 | 1140 | 0.0156 | 0.0036 | 0.0069 | 0.0034 | 0.6875 | 0.0000 |
+| 0.0253 | 58.0 | 1160 | 0.0154 | 0.0041 | 0.0069 | 0.0034 | 0.6875 | 0.0000 |
+| 0.0253 | 59.0 | 1180 | 0.0155 | 0.0039 | 0.0072 | 0.0036 | 0.7188 | 0.0000 |
+| 0.0253 | 60.0 | 1200 | 0.0155 | 0.0036 | 0.0069 | 0.0034 | 0.6875 | 0.0000 |
+| 0.0253 | 61.0 | 1220 | 0.0154 | 0.0038 | 0.0069 | 0.0034 | 0.6875 | 0.0000 |
+| 0.0253 | 62.0 | 1240 | 0.0156 | 0.0041 | 0.0069 | 0.0034 | 0.6875 | 0.0000 |
+| 0.0253 | 63.0 | 1260 | 0.0152 | 0.0038 | 0.0069 | 0.0034 | 0.6875 | 0.0000 |
+| 0.0253 | 64.0 | 1280 | 0.0146 | 0.0036 | 0.0069 | 0.0034 | 0.6875 | 0.0000 |
+| 0.0253 | 65.0 | 1300 | 0.0147 | 0.0041 | 0.0069 | 0.0034 | 0.6875 | 7e-06 |
+| 0.0253 | 66.0 | 1320 | 0.0149 | 0.0039 | 0.0066 | 0.0033 | 0.6562 | 0.0000 |
+| 0.0253 | 67.0 | 1340 | 0.0148 | 0.0038 | 0.0062 | 0.0031 | 0.625 | 0.0000 |
+| 0.0253 | 68.0 | 1360 | 0.0148 | 0.0039 | 0.0066 | 0.0033 | 0.6562 | 0.0000 |
+| 0.0253 | 69.0 | 1380 | 0.0143 | 0.0041 | 0.0069 | 0.0034 | 0.6875 | 0.0000 |
+| 0.0253 | 70.0 | 1400 | 0.0144 | 0.0039 | 0.0062 | 0.0031 | 0.625 | 6e-06 |
+| 0.0253 | 71.0 | 1420 | 0.0145 | 0.0039 | 0.0066 | 0.0033 | 0.6562 | 0.0000 |
+| 0.0253 | 72.0 | 1440 | 0.0141 | 0.0038 | 0.0066 | 0.0033 | 0.6562 | 0.0000 |
+| 0.0253 | 73.0 | 1460 | 0.0144 | 0.0039 | 0.0066 | 0.0033 | 0.6562 | 0.0000 |
+| 0.0253 | 74.0 | 1480 | 0.0144 | 0.0039 | 0.0066 | 0.0033 | 0.6562 | 0.0000 |
+| 0.019 | 75.0 | 1500 | 0.0142 | 0.0036 | 0.0062 | 0.0031 | 0.625 | 5e-06 |
+| 0.019 | 76.0 | 1520 | 0.0140 | 0.0041 | 0.0066 | 0.0033 | 0.6562 | 0.0000 |
+| 0.019 | 77.0 | 1540 | 0.0139 | 0.0039 | 0.0066 | 0.0033 | 0.6562 | 0.0000 |
+| 0.019 | 78.0 | 1560 | 0.0140 | 0.0039 | 0.0066 | 0.0033 | 0.6562 | 0.0000 |
+| 0.019 | 79.0 | 1580 | 0.0139 | 0.0038 | 0.0059 | 0.0030 | 0.5938 | 0.0000 |
+| 0.019 | 80.0 | 1600 | 0.0139 | 0.0039 | 0.0066 | 0.0033 | 0.6562 | 0.0000 |
+| 0.019 | 81.0 | 1620 | 0.0139 | 0.0042 | 0.0066 | 0.0033 | 0.6562 | 0.0000 |
+| 0.019 | 82.0 | 1640 | 0.0136 | 0.0036 | 0.0062 | 0.0031 | 0.625 | 0.0000 |
+| 0.019 | 83.0 | 1660 | 0.0138 | 0.0041 | 0.0062 | 0.0031 | 0.625 | 0.0000 |
+| 0.019 | 84.0 | 1680 | 0.0136 | 0.0039 | 0.0059 | 0.0030 | 0.5938 | 0.0000 |
+| 0.019 | 85.0 | 1700 | 0.0136 | 0.0038 | 0.0059 | 0.0030 | 0.5938 | 3e-06 |
+| 0.019 | 86.0 | 1720 | 0.0136 | 0.0038 | 0.0059 | 0.0030 | 0.5938 | 0.0000 |
+| 0.019 | 87.0 | 1740 | 0.0133 | 0.0038 | 0.0062 | 0.0031 | 0.625 | 0.0000 |
+| 0.019 | 88.0 | 1760 | 0.0137 | 0.0039 | 0.0059 | 0.0030 | 0.5938 | 0.0000 |
+| 0.019 | 89.0 | 1780 | 0.0134 | 0.0036 | 0.0059 | 0.0030 | 0.5938 | 0.0000 |
+| 0.019 | 90.0 | 1800 | 0.0133 | 0.0038 | 0.0059 | 0.0030 | 0.5938 | 0.0000 |
+| 0.019 | 91.0 | 1820 | 0.0137 | 0.0041 | 0.0066 | 0.0033 | 0.6562 | 0.0000 |
+| 0.019 | 92.0 | 1840 | 0.0134 | 0.0038 | 0.0059 | 0.0030 | 0.5938 | 0.0000 |
+| 0.019 | 93.0 | 1860 | 0.0135 | 0.0038 | 0.0059 | 0.0030 | 0.5938 | 0.0000 |
+| 0.019 | 94.0 | 1880 | 0.0134 | 0.0038 | 0.0062 | 0.0031 | 0.625 | 0.0000 |
+| 0.019 | 95.0 | 1900 | 0.0136 | 0.0039 | 0.0059 | 0.0030 | 0.5938 | 0.0000 |
+| 0.019 | 96.0 | 1920 | 0.0135 | 0.0039 | 0.0059 | 0.0030 | 0.5938 | 0.0000 |
+| 0.019 | 97.0 | 1940 | 0.0134 | 0.0038 | 0.0062 | 0.0031 | 0.625 | 0.0000 |
+| 0.019 | 98.0 | 1960 | 0.0134 | 0.0038 | 0.0062 | 0.0031 | 0.625 | 0.0000 |
+| 0.019 | 99.0 | 1980 | 0.0134 | 0.0038 | 0.0062 | 0.0031 | 0.625 | 0.0000 |
+| 0.0156 | 100.0 | 2000 | 0.0134 | 0.0038 | 0.0062 | 0.0031 | 0.625 | 0.0 |
 
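The step counts in the table imply the training-set size: 20 optimizer steps per epoch at `train_batch_size` 32 suggests roughly 609 to 640 training examples, consistent with the earlier run's 39 steps per epoch at batch size 16. A quick check, assuming no gradient accumulation (the card does not list an accumulation factor):

```python
import math

def steps_per_epoch(num_examples: int, batch_size: int) -> int:
    """Optimizer steps per epoch with no gradient accumulation (an assumption)."""
    return math.ceil(num_examples / batch_size)

print(steps_per_epoch(640, 32))  # 20, matching this run's steps per epoch
print(steps_per_epoch(624, 16))  # 39, matching the earlier run
```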
 
 ### Framework versions
 
+- Transformers 4.40.1
 - Pytorch 2.2.1+cu121
 - Datasets 2.19.0
 - Tokenizers 0.19.1
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:fc9c7fdcf264e908bde4bf32f669ef95fddbf5b47861a41ffe3ed0fed8c3bdc2
 size 567740204
 
 version https://git-lfs.github.com/spec/v1
+oid sha256:7518820919cea1e236197e23912a2cf1c275179454603d4b8d5b9bed215b42f8
 size 567740204
runs/May01_02-33-22_93ad8adf7f6f/events.out.tfevents.1714530809.93ad8adf7f6f.2125.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:7db89fc4cbbad2c25934d1b1755a02f4b02569921bf8c63dc55902fa8c8f8b72
-size 63205
 
 version https://git-lfs.github.com/spec/v1
+oid sha256:73f134f8a4060c335b054abf40a72e966a5adc313a5941075663d0c4517bc256
+size 64358