Amine committed · ad7dfe0
1 Parent(s): 7178eec
MT-fallen-firebrand-113

Files changed:
- README.md +199 -0
- pytorch_model.bin +1 -1
README.md
ADDED
@@ -0,0 +1,199 @@
---
base_model: toobiza/MT-smart-feather-100
tags:
- generated_from_trainer
model-index:
- name: MT-fallen-firebrand-113
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# MT-fallen-firebrand-113

This model is a fine-tuned version of [toobiza/MT-smart-feather-100](https://huggingface.co/toobiza/MT-smart-feather-100) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2177
- Loss Ce: 0.0007
- Loss Bbox: 0.0252
- Cardinality Error: 1.0
- Giou: 95.5116

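The card does not say so explicitly, but these metric names match the DETR-style set-prediction criterion: assuming that is what the Trainer used here, the total loss is a weighted sum of a classification cross-entropy term, an L1 bounding-box regression term, and a generalized-IoU term, with weights set by the model config:

$$
\mathcal{L} = \lambda_{\text{ce}}\,\mathcal{L}_{\text{ce}} + \lambda_{\text{bbox}}\,\mathcal{L}_{\text{L1}} + \lambda_{\text{giou}}\,\mathcal{L}_{\text{giou}}
$$

Cardinality error counts how far the number of predicted objects is from the number of ground-truth objects per image.
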
## Model description

More information needed

## Intended uses & limitations

More information needed

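Since usage details are missing, here is a minimal inference sketch. It assumes this checkpoint is a DETR-style object-detection model (which the metrics above suggest) published as `toobiza/MT-fallen-firebrand-113`; the repo id, input image path, and score threshold are all placeholders:

```python
# Minimal inference sketch, assuming a DETR-style object-detection checkpoint.
# The Auto classes resolve the concrete architecture from the repo's config.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

model_id = "toobiza/MT-fallen-firebrand-113"  # assumed repo id for this card
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForObjectDetection.from_pretrained(model_id)
model.eval()

image = Image.open("page.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Turn raw logits/boxes into (score, label, box) triples above a threshold.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
detections = processor.post_process_object_detection(
    outputs, threshold=0.9, target_sizes=target_sizes
)[0]
for score, label, box in zip(
    detections["scores"], detections["labels"], detections["boxes"]
):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```

If the architecture turns out not to be object detection, the equivalent `Auto` class for the actual task applies instead.
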
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3

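For reference, the list above maps onto `transformers.TrainingArguments` roughly as follows; this is a sketch, not the original training script (the output path is a placeholder, and the Adam betas/epsilon are the stated values, which match the defaults):

```python
# Rough TrainingArguments equivalent of the hyperparameters listed above
# (a sketch; the actual training script is not part of this card).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="MT-fallen-firebrand-113",  # placeholder output path
    learning_rate=1e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=3,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```
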
### Training results

| Training Loss | Epoch | Step | Validation Loss | Loss Ce | Loss Bbox | Cardinality Error | Giou    |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:---------:|:-----------------:|:-------:|
| 0.0433        | 0.02  | 10   | 0.2285          | 0.0005  | 0.0272    | 1.0               | 95.4320 |
| 0.1123        | 0.04  | 20   | 0.2205          | 0.0006  | 0.0264    | 1.0               | 95.6513 |
| 0.2672        | 0.06  | 30   | 0.2229          | 0.0007  | 0.0264    | 1.0               | 95.5263 |
| 0.1671        | 0.09  | 40   | 0.2266          | 0.0006  | 0.0267    | 1.0               | 95.4210 |
| 0.0761        | 0.11  | 50   | 0.2268          | 0.0006  | 0.0272    | 1.0               | 95.5399 |
| 0.0882        | 0.13  | 60   | 0.2234          | 0.0006  | 0.0269    | 1.0               | 95.6263 |
| 0.0525        | 0.15  | 70   | 0.2277          | 0.0006  | 0.0273    | 1.0               | 95.5031 |
| 0.0789        | 0.17  | 80   | 0.2277          | 0.0006  | 0.0271    | 1.0               | 95.4728 |
| 0.0661        | 0.19  | 90   | 0.2315          | 0.0006  | 0.0270    | 1.0               | 95.2434 |
| 0.0534        | 0.21  | 100  | 0.2304          | 0.0006  | 0.0270    | 1.0               | 95.2914 |
| 0.133         | 0.23  | 110  | 0.2318          | 0.0007  | 0.0268    | 1.0               | 95.2015 |
| 0.0908        | 0.26  | 120  | 0.2381          | 0.0006  | 0.0282    | 1.0               | 95.2005 |
| 0.0893        | 0.28  | 130  | 0.2388          | 0.0005  | 0.0279    | 1.0               | 95.1057 |
| 0.0516        | 0.3   | 140  | 0.2389          | 0.0005  | 0.0279    | 1.0               | 95.1262 |
| 0.2496        | 0.32  | 150  | 0.2378          | 0.0005  | 0.0277    | 1.0               | 95.1056 |
| 0.0914        | 0.34  | 160  | 0.2278          | 0.0005  | 0.0262    | 1.0               | 95.2652 |
| 0.1489        | 0.36  | 170  | 0.2206          | 0.0006  | 0.0261    | 1.0               | 95.5911 |
| 0.2304        | 0.38  | 180  | 0.2192          | 0.0007  | 0.0260    | 1.0               | 95.6501 |
| 0.058         | 0.41  | 190  | 0.2187          | 0.0007  | 0.0257    | 1.0               | 95.6025 |
| 0.2195        | 0.43  | 200  | 0.2179          | 0.0006  | 0.0257    | 1.0               | 95.6214 |
| 0.076         | 0.45  | 210  | 0.2188          | 0.0006  | 0.0256    | 1.0               | 95.5425 |
| 0.054         | 0.47  | 220  | 0.2206          | 0.0006  | 0.0256    | 1.0               | 95.4653 |
| 0.1074        | 0.49  | 230  | 0.2260          | 0.0006  | 0.0259    | 1.0               | 95.2627 |
| 0.0648        | 0.51  | 240  | 0.2270          | 0.0006  | 0.0268    | 1.0               | 95.4231 |
| 0.2838        | 0.53  | 250  | 0.2227          | 0.0006  | 0.0263    | 1.0               | 95.5340 |
| 0.1706        | 0.55  | 260  | 0.2238          | 0.0006  | 0.0260    | 1.0               | 95.4056 |
| 0.065         | 0.58  | 270  | 0.2260          | 0.0007  | 0.0258    | 1.0               | 95.2666 |
| 0.063         | 0.6   | 280  | 0.2238          | 0.0007  | 0.0262    | 1.0               | 95.4676 |
| 0.0667        | 0.62  | 290  | 0.2206          | 0.0006  | 0.0262    | 1.0               | 95.6224 |
| 0.087         | 0.64  | 300  | 0.2238          | 0.0006  | 0.0259    | 1.0               | 95.3894 |
| 0.1819        | 0.66  | 310  | 0.2290          | 0.0006  | 0.0261    | 1.0               | 95.1798 |
| 0.1198        | 0.68  | 320  | 0.2251          | 0.0007  | 0.0264    | 1.0               | 95.4342 |
| 0.0732        | 0.7   | 330  | 0.2203          | 0.0007  | 0.0259    | 1.0               | 95.5583 |
| 0.1283        | 0.72  | 340  | 0.2255          | 0.0008  | 0.0258    | 1.0               | 95.2776 |
| 0.1481        | 0.75  | 350  | 0.2243          | 0.0008  | 0.0259    | 1.0               | 95.3665 |
| 0.1448        | 0.77  | 360  | 0.2199          | 0.0008  | 0.0259    | 1.0               | 95.5715 |
| 0.0831        | 0.79  | 370  | 0.2228          | 0.0008  | 0.0261    | 1.0               | 95.5002 |
| 0.0936        | 0.81  | 380  | 0.2251          | 0.0007  | 0.0260    | 1.0               | 95.3486 |
| 0.0715        | 0.83  | 390  | 0.2204          | 0.0007  | 0.0257    | 1.0               | 95.5102 |
| 0.0672        | 0.85  | 400  | 0.2234          | 0.0007  | 0.0256    | 1.0               | 95.3418 |
| 0.0713        | 0.87  | 410  | 0.2233          | 0.0007  | 0.0259    | 1.0               | 95.3986 |
| 0.0776        | 0.9   | 420  | 0.2269          | 0.0007  | 0.0259    | 1.0               | 95.2251 |
| 0.129         | 0.92  | 430  | 0.2278          | 0.0007  | 0.0259    | 1.0               | 95.1863 |
| 0.1938        | 0.94  | 440  | 0.2315          | 0.0007  | 0.0262    | 1.0               | 95.0673 |
| 0.0841        | 0.96  | 450  | 0.2271          | 0.0007  | 0.0258    | 1.0               | 95.1982 |
| 0.1348        | 0.98  | 460  | 0.2233          | 0.0008  | 0.0258    | 1.0               | 95.4038 |
| 0.0668        | 1.0   | 470  | 0.2243          | 0.0008  | 0.0257    | 1.0               | 95.3235 |
| 0.3109        | 1.02  | 480  | 0.2278          | 0.0008  | 0.0258    | 1.0               | 95.1576 |
| 0.069         | 1.04  | 490  | 0.2312          | 0.0007  | 0.0257    | 1.0               | 94.9736 |
| 0.0627        | 1.07  | 500  | 0.2284          | 0.0007  | 0.0254    | 1.0               | 95.0170 |
| 0.0852        | 1.09  | 510  | 0.2291          | 0.0006  | 0.0252    | 1.0               | 94.9385 |
| 0.0679        | 1.11  | 520  | 0.2259          | 0.0006  | 0.0253    | 1.0               | 95.1204 |
| 0.0543        | 1.13  | 530  | 0.2274          | 0.0006  | 0.0251    | 1.0               | 94.9929 |
| 0.1877        | 1.15  | 540  | 0.2248          | 0.0007  | 0.0249    | 1.0               | 95.0927 |
| 0.1028        | 1.17  | 550  | 0.2177          | 0.0007  | 0.0251    | 1.0               | 95.4758 |
| 0.068         | 1.19  | 560  | 0.2142          | 0.0007  | 0.0249    | 1.0               | 95.6181 |
| 0.0953        | 1.22  | 570  | 0.2120          | 0.0007  | 0.0248    | 1.0               | 95.6968 |
| 0.0548        | 1.24  | 580  | 0.2160          | 0.0007  | 0.0249    | 1.0               | 95.5313 |
| 0.2291        | 1.26  | 590  | 0.2192          | 0.0007  | 0.0252    | 1.0               | 95.4303 |
| 0.0565        | 1.28  | 600  | 0.2148          | 0.0007  | 0.0246    | 1.0               | 95.5175 |
| 0.0643        | 1.3   | 610  | 0.2165          | 0.0007  | 0.0248    | 1.0               | 95.4779 |
| 0.0527        | 1.32  | 620  | 0.2134          | 0.0007  | 0.0244    | 1.0               | 95.5284 |
| 0.1215        | 1.34  | 630  | 0.2106          | 0.0007  | 0.0240    | 1.0               | 95.5701 |
| 0.0851        | 1.36  | 640  | 0.2148          | 0.0007  | 0.0247    | 1.0               | 95.5315 |
| 0.0748        | 1.39  | 650  | 0.2188          | 0.0007  | 0.0252    | 1.0               | 95.4600 |
| 0.0632        | 1.41  | 660  | 0.2215          | 0.0007  | 0.0254    | 1.0               | 95.3640 |
| 0.0634        | 1.43  | 670  | 0.2232          | 0.0007  | 0.0255    | 1.0               | 95.3053 |
| 0.0851        | 1.45  | 680  | 0.2235          | 0.0006  | 0.0256    | 1.0               | 95.3000 |
| 0.0597        | 1.47  | 690  | 0.2250          | 0.0006  | 0.0255    | 1.0               | 95.2242 |
| 0.0522        | 1.49  | 700  | 0.2252          | 0.0007  | 0.0259    | 1.0               | 95.3196 |
| 0.1491        | 1.51  | 710  | 0.2250          | 0.0007  | 0.0260    | 1.0               | 95.3516 |
| 0.0515        | 1.54  | 720  | 0.2205          | 0.0007  | 0.0254    | 1.0               | 95.4144 |
| 0.1826        | 1.56  | 730  | 0.2199          | 0.0007  | 0.0254    | 1.0               | 95.4645 |
| 0.0529        | 1.58  | 740  | 0.2230          | 0.0007  | 0.0254    | 1.0               | 95.2972 |
| 0.0587        | 1.6   | 750  | 0.2229          | 0.0007  | 0.0256    | 1.0               | 95.3546 |
| 0.0589        | 1.62  | 760  | 0.2216          | 0.0007  | 0.0256    | 1.0               | 95.4016 |
| 0.0715        | 1.64  | 770  | 0.2216          | 0.0006  | 0.0254    | 1.0               | 95.3633 |
| 0.1186        | 1.66  | 780  | 0.2196          | 0.0006  | 0.0254    | 1.0               | 95.4553 |
| 0.0498        | 1.68  | 790  | 0.2179          | 0.0006  | 0.0255    | 1.0               | 95.5713 |
| 0.1199        | 1.71  | 800  | 0.2192          | 0.0006  | 0.0257    | 1.0               | 95.5583 |
| 0.2151        | 1.73  | 810  | 0.2196          | 0.0006  | 0.0257    | 1.0               | 95.5410 |
| 0.1703        | 1.75  | 820  | 0.2211          | 0.0006  | 0.0259    | 1.0               | 95.4973 |
| 0.0782        | 1.77  | 830  | 0.2244          | 0.0006  | 0.0259    | 1.0               | 95.3385 |
| 0.0575        | 1.79  | 840  | 0.2258          | 0.0006  | 0.0262    | 1.0               | 95.3399 |
| 0.1211        | 1.81  | 850  | 0.2239          | 0.0006  | 0.0261    | 1.0               | 95.4278 |
| 0.067         | 1.83  | 860  | 0.2232          | 0.0006  | 0.0261    | 1.0               | 95.4541 |
| 0.0521        | 1.86  | 870  | 0.2214          | 0.0006  | 0.0258    | 1.0               | 95.4654 |
| 0.1619        | 1.88  | 880  | 0.2206          | 0.0006  | 0.0256    | 1.0               | 95.4628 |
| 0.0656        | 1.9   | 890  | 0.2217          | 0.0006  | 0.0257    | 1.0               | 95.4210 |
| 0.1299        | 1.92  | 900  | 0.2216          | 0.0006  | 0.0257    | 1.0               | 95.4406 |
| 0.1795        | 1.94  | 910  | 0.2258          | 0.0006  | 0.0260    | 1.0               | 95.2944 |
| 0.0684        | 1.96  | 920  | 0.2288          | 0.0006  | 0.0257    | 1.0               | 95.0962 |
| 0.0683        | 1.98  | 930  | 0.2287          | 0.0006  | 0.0254    | 1.0               | 95.0161 |
| 0.1629        | 2.0   | 940  | 0.2300          | 0.0006  | 0.0255    | 1.0               | 94.9721 |
| 0.2252        | 2.03  | 950  | 0.2283          | 0.0006  | 0.0256    | 1.0               | 95.0778 |
| 0.097         | 2.05  | 960  | 0.2246          | 0.0006  | 0.0255    | 1.0               | 95.2448 |
| 0.0582        | 2.07  | 970  | 0.2247          | 0.0007  | 0.0253    | 1.0               | 95.2004 |
| 0.0613        | 2.09  | 980  | 0.2189          | 0.0007  | 0.0251    | 1.0               | 95.4246 |
| 0.055         | 2.11  | 990  | 0.2192          | 0.0007  | 0.0252    | 1.0               | 95.4337 |
| 0.1202        | 2.13  | 1000 | 0.2233          | 0.0007  | 0.0254    | 1.0               | 95.2810 |
| 0.0426        | 2.15  | 1010 | 0.2266          | 0.0006  | 0.0255    | 1.0               | 95.1437 |
| 0.0542        | 2.17  | 1020 | 0.2264          | 0.0006  | 0.0255    | 1.0               | 95.1477 |
| 0.0693        | 2.2   | 1030 | 0.2275          | 0.0006  | 0.0256    | 1.0               | 95.1170 |
| 0.0617        | 2.22  | 1040 | 0.2235          | 0.0006  | 0.0253    | 1.0               | 95.2543 |
| 0.0703        | 2.24  | 1050 | 0.2223          | 0.0007  | 0.0252    | 1.0               | 95.2883 |
| 0.1624        | 2.26  | 1060 | 0.2231          | 0.0007  | 0.0252    | 1.0               | 95.2493 |
| 0.1168        | 2.28  | 1070 | 0.2235          | 0.0007  | 0.0252    | 1.0               | 95.2330 |
| 0.0784        | 2.3   | 1080 | 0.2207          | 0.0007  | 0.0252    | 1.0               | 95.3759 |
| 0.0627        | 2.32  | 1090 | 0.2205          | 0.0007  | 0.0253    | 1.0               | 95.4080 |
| 0.0856        | 2.35  | 1100 | 0.2202          | 0.0007  | 0.0254    | 1.0               | 95.4356 |
| 0.0587        | 2.37  | 1110 | 0.2198          | 0.0007  | 0.0254    | 1.0               | 95.4616 |
| 0.0591        | 2.39  | 1120 | 0.2181          | 0.0007  | 0.0253    | 1.0               | 95.5062 |
| 0.3139        | 2.41  | 1130 | 0.2149          | 0.0007  | 0.0247    | 1.0               | 95.5317 |
| 0.0535        | 2.43  | 1140 | 0.2124          | 0.0007  | 0.0244    | 1.0               | 95.5952 |
| 0.0502        | 2.45  | 1150 | 0.2135          | 0.0007  | 0.0245    | 1.0               | 95.5379 |
| 0.0574        | 2.47  | 1160 | 0.2142          | 0.0007  | 0.0247    | 1.0               | 95.5711 |
| 0.0415        | 2.49  | 1170 | 0.2143          | 0.0007  | 0.0249    | 1.0               | 95.6145 |
| 0.0394        | 2.52  | 1180 | 0.2141          | 0.0008  | 0.0249    | 1.0               | 95.6154 |
| 0.1392        | 2.54  | 1190 | 0.2128          | 0.0008  | 0.0249    | 1.0               | 95.6786 |
| 0.1257        | 2.56  | 1200 | 0.2130          | 0.0007  | 0.0249    | 1.0               | 95.6791 |
| 0.0493        | 2.58  | 1210 | 0.2149          | 0.0007  | 0.0249    | 1.0               | 95.5834 |
| 0.1574        | 2.6   | 1220 | 0.2173          | 0.0007  | 0.0250    | 1.0               | 95.4966 |
| 0.077         | 2.62  | 1230 | 0.2183          | 0.0008  | 0.0251    | 1.0               | 95.4721 |
| 0.0616        | 2.64  | 1240 | 0.2172          | 0.0008  | 0.0252    | 1.0               | 95.5329 |
| 0.0515        | 2.67  | 1250 | 0.2169          | 0.0008  | 0.0252    | 1.0               | 95.5528 |
| 0.0464        | 2.69  | 1260 | 0.2158          | 0.0008  | 0.0252    | 1.0               | 95.6043 |
| 0.1273        | 2.71  | 1270 | 0.2151          | 0.0008  | 0.0252    | 1.0               | 95.6373 |
| 0.0602        | 2.73  | 1280 | 0.2157          | 0.0008  | 0.0252    | 1.0               | 95.6047 |
| 0.1282        | 2.75  | 1290 | 0.2166          | 0.0008  | 0.0251    | 1.0               | 95.5521 |
| 0.0552        | 2.77  | 1300 | 0.2172          | 0.0008  | 0.0252    | 1.0               | 95.5391 |
| 0.1731        | 2.79  | 1310 | 0.2176          | 0.0007  | 0.0252    | 1.0               | 95.5261 |
| 0.0999        | 2.81  | 1320 | 0.2169          | 0.0007  | 0.0252    | 1.0               | 95.5514 |
| 0.1018        | 2.84  | 1330 | 0.2169          | 0.0007  | 0.0252    | 1.0               | 95.5517 |
| 0.1837        | 2.86  | 1340 | 0.2172          | 0.0007  | 0.0252    | 1.0               | 95.5448 |
| 0.0582        | 2.88  | 1350 | 0.2165          | 0.0007  | 0.0252    | 1.0               | 95.5744 |
| 0.0671        | 2.9   | 1360 | 0.2171          | 0.0008  | 0.0252    | 1.0               | 95.5425 |
| 0.0451        | 2.92  | 1370 | 0.2176          | 0.0007  | 0.0252    | 1.0               | 95.5208 |
| 0.0829        | 2.94  | 1380 | 0.2173          | 0.0007  | 0.0252    | 1.0               | 95.5369 |
| 0.0575        | 2.96  | 1390 | 0.2176          | 0.0007  | 0.0252    | 1.0               | 95.5150 |
| 0.0926        | 2.99  | 1400 | 0.2177          | 0.0007  | 0.0252    | 1.0               | 95.5116 |


### Framework versions

- Transformers 4.33.2
- Pytorch 2.1.0+cu118
- Datasets 2.14.6
- Tokenizers 0.13.3

pytorch_model.bin
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:1df3002d40e4a4e37f44c8053db9cf535da61f254ea5ccb11fbd6e60b085492e
 size 115385222