End of training
README.md
ADDED
@@ -0,0 +1,155 @@
---
library_name: transformers
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: xlnetFv4_ftis_noPretrain
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# xlnetFv4_ftis_noPretrain

This model is a fine-tuned version of [](https://huggingface.co/) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.7658
- Accuracy: 0.4289
- Macro F1: 0.1717

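As a minimal sketch (not necessarily the exact callback used for this run), accuracy and macro F1 like those above can be produced by a `compute_metrics` function passed to the `Trainer`:

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

def compute_metrics(eval_pred):
    """Compute accuracy and macro F1 from a (logits, labels) evaluation pair."""
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_score(labels, predictions),
        "macro_f1": f1_score(labels, predictions, average="macro"),
    }
```
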
## Model description

More information needed

## Intended uses & limitations

More information needed

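No usage guidance is recorded yet. As a starting point, a minimal loading sketch: the sequence-classification head is an assumption (the card only reports accuracy and macro F1), and `model_path` is a placeholder for the actual repository id or a local checkpoint directory.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Placeholder path: replace with the real repo id or local checkpoint directory.
model_path = "./xlnetFv4_ftis_noPretrain"

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForSequenceClassification.from_pretrained(model_path)
model.eval()

inputs = tokenizer("Example input text", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1).item())  # predicted class index
```
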
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch of how they map onto `TrainingArguments` follows the list):
- learning_rate: 0.0001
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 53850
- training_steps: 1077000

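As a rough sketch (this is not the original training script, and the model/dataset construction is omitted), the settings above correspond to a `TrainingArguments` configuration like the following:

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters listed above; values mirror the card.
training_args = TrainingArguments(
    output_dir="xlnetFv4_ftis_noPretrain",
    learning_rate=1e-4,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=53850,
    max_steps=1077000,
)
# These arguments would be passed to transformers.Trainer together with the
# model, the train/eval datasets, and the compute_metrics callback sketched above.
```
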
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Macro F1 |
|:-------------:|:-------:|:-----:|:---------------:|:--------:|:--------:|
| 3.2778 | 0.0009 | 1000 | 3.4101 | 0.1492 | 0.0508 |
| 2.6169 | 1.0009 | 2000 | 3.0657 | 0.3318 | 0.0747 |
| 2.0767 | 2.0008 | 3000 | 2.5978 | 0.3767 | 0.0775 |
| 1.9754 | 3.0007 | 4000 | 2.4066 | 0.4042 | 0.0882 |
| 2.0542 | 4.0006 | 5000 | 2.2766 | 0.4015 | 0.0917 |
| 1.8771 | 5.0006 | 6000 | 2.2322 | 0.4058 | 0.1041 |
| 1.9728 | 6.0005 | 7000 | 2.1515 | 0.3971 | 0.1038 |
| 1.9356 | 7.0004 | 8000 | 2.1180 | 0.4048 | 0.1076 |
| 1.8861 | 8.0004 | 9000 | 2.0644 | 0.4044 | 0.1101 |
| 1.8417 | 9.0003 | 10000 | 2.0713 | 0.3948 | 0.1095 |
| 1.9464 | 10.0002 | 11000 | 2.0515 | 0.3801 | 0.1081 |
| 1.9529 | 11.0001 | 12000 | 2.0508 | 0.3754 | 0.1055 |
| 1.8001 | 12.0001 | 13000 | 2.0173 | 0.3900 | 0.1110 |
| 1.895 | 12.0010 | 14000 | 2.0202 | 0.3920 | 0.1123 |
| 1.9938 | 13.0009 | 15000 | 1.9929 | 0.3601 | 0.1032 |
| 1.9619 | 14.0009 | 16000 | 1.9595 | 0.3882 | 0.1153 |
| 1.9526 | 15.0008 | 17000 | 1.9886 | 0.3566 | 0.1079 |
| 1.792 | 16.0007 | 18000 | 1.8701 | 0.3935 | 0.1162 |
| 1.7947 | 17.0006 | 19000 | 1.9231 | 0.4008 | 0.1267 |
| 1.964 | 18.0006 | 20000 | 1.9059 | 0.4176 | 0.1343 |
| 1.8604 | 19.0005 | 21000 | 1.8850 | 0.3899 | 0.1239 |
| 1.9679 | 20.0004 | 22000 | 1.8949 | 0.4385 | 0.1398 |
| 1.9046 | 21.0004 | 23000 | 1.8807 | 0.3680 | 0.1262 |
| 1.786 | 22.0003 | 24000 | 1.8673 | 0.4111 | 0.1437 |
| 1.902 | 23.0002 | 25000 | 1.8611 | 0.4282 | 0.1429 |
| 2.0151 | 24.0001 | 26000 | 1.8620 | 0.3950 | 0.1350 |
| 1.8067 | 25.0001 | 27000 | 1.8583 | 0.4070 | 0.1387 |
| 1.7885 | 25.0010 | 28000 | 1.8710 | 0.4190 | 0.1427 |
| 1.9458 | 26.0009 | 29000 | 1.8955 | 0.4172 | 0.1382 |
| 1.8265 | 27.0009 | 30000 | 1.8044 | 0.4128 | 0.1385 |
| 1.8682 | 28.0008 | 31000 | 1.8223 | 0.3833 | 0.1313 |
| 1.8845 | 29.0007 | 32000 | 1.8387 | 0.3893 | 0.1348 |
| 1.8141 | 30.0006 | 33000 | 1.8078 | 0.4263 | 0.1465 |
| 1.9524 | 31.0006 | 34000 | 1.8476 | 0.4008 | 0.1424 |
| 1.8705 | 32.0005 | 35000 | 1.7960 | 0.3955 | 0.1365 |
| 1.9233 | 33.0004 | 36000 | 1.8082 | 0.4222 | 0.1515 |
| 1.7887 | 34.0004 | 37000 | 1.8289 | 0.4197 | 0.1392 |
| 1.8995 | 35.0003 | 38000 | 1.8094 | 0.3644 | 0.1302 |
| 1.9193 | 36.0002 | 39000 | 1.7976 | 0.4105 | 0.1333 |
| 1.807 | 37.0001 | 40000 | 1.7958 | 0.3994 | 0.1258 |
| 1.6897 | 38.0001 | 41000 | 1.8255 | 0.4009 | 0.1400 |
| 1.8465 | 38.0010 | 42000 | 1.8031 | 0.3839 | 0.1352 |
| 1.9148 | 39.0009 | 43000 | 1.8039 | 0.3886 | 0.1399 |
| 1.9204 | 40.0009 | 44000 | 1.8456 | 0.3819 | 0.1312 |
| 1.8785 | 41.0008 | 45000 | 1.8408 | 0.4060 | 0.1466 |
| 1.9642 | 42.0007 | 46000 | 1.8162 | 0.3727 | 0.1427 |
| 1.9356 | 43.0006 | 47000 | 1.8446 | 0.4005 | 0.1513 |
| 1.7817 | 44.0006 | 48000 | 1.7895 | 0.4063 | 0.1453 |
| 1.8289 | 45.0005 | 49000 | 1.7917 | 0.4138 | 0.1392 |
| 1.8518 | 46.0004 | 50000 | 1.8293 | 0.4007 | 0.1589 |
| 1.7782 | 47.0004 | 51000 | 1.7812 | 0.4088 | 0.1355 |
| 1.9633 | 48.0003 | 52000 | 1.8348 | 0.4056 | 0.1514 |
| 1.9718 | 49.0002 | 53000 | 1.7777 | 0.4426 | 0.1595 |
| 1.9248 | 50.0001 | 54000 | 1.7779 | 0.4174 | 0.1501 |
| 1.7681 | 51.0001 | 55000 | 1.7986 | 0.4014 | 0.1368 |
| 1.8174 | 51.0010 | 56000 | 1.8043 | 0.4056 | 0.1420 |
| 1.841 | 52.0009 | 57000 | 1.7879 | 0.3974 | 0.1425 |
| 1.8102 | 53.0009 | 58000 | 1.8208 | 0.4252 | 0.1552 |
| 1.8059 | 54.0008 | 59000 | 1.7790 | 0.4062 | 0.1501 |
| 1.8195 | 55.0007 | 60000 | 1.7846 | 0.4059 | 0.1538 |
| 1.7883 | 56.0006 | 61000 | 1.7771 | 0.3950 | 0.1535 |
| 1.8632 | 57.0006 | 62000 | 1.7695 | 0.4177 | 0.1601 |
| 1.9495 | 58.0005 | 63000 | 1.7984 | 0.4275 | 0.1601 |
| 1.9593 | 59.0004 | 64000 | 1.7376 | 0.4261 | 0.1488 |
| 1.8409 | 60.0004 | 65000 | 1.7984 | 0.3857 | 0.1509 |
| 1.8503 | 61.0003 | 66000 | 1.7980 | 0.3936 | 0.1593 |
| 1.9144 | 62.0002 | 67000 | 1.7780 | 0.4075 | 0.1613 |
| 1.8632 | 63.0001 | 68000 | 1.8192 | 0.4245 | 0.1565 |
| 1.7526 | 64.0001 | 69000 | 1.7383 | 0.4045 | 0.1525 |
| 1.6273 | 64.0010 | 70000 | 1.8099 | 0.4485 | 0.1603 |
| 1.8939 | 65.0009 | 71000 | 1.7749 | 0.4344 | 0.1708 |
| 1.8592 | 66.0009 | 72000 | 1.7911 | 0.3803 | 0.1374 |
| 1.8739 | 67.0008 | 73000 | 1.7427 | 0.4399 | 0.1599 |
| 1.7345 | 68.0007 | 74000 | 1.8108 | 0.4179 | 0.1396 |
| 1.9316 | 69.0006 | 75000 | 1.7658 | 0.4289 | 0.1717 |
| 1.7924 | 70.0006 | 76000 | 1.7827 | 0.4247 | 0.1662 |
| 1.8339 | 71.0005 | 77000 | 1.7344 | 0.4266 | 0.1623 |
| 1.9731 | 72.0004 | 78000 | 1.8000 | 0.3535 | 0.1539 |
| 1.8868 | 73.0004 | 79000 | 1.7762 | 0.3975 | 0.1566 |
| 1.8885 | 74.0003 | 80000 | 1.7581 | 0.4000 | 0.1642 |
| 1.8781 | 75.0002 | 81000 | 1.8021 | 0.3695 | 0.1360 |
| 1.9189 | 76.0001 | 82000 | 1.7375 | 0.4177 | 0.1551 |
| 1.8382 | 77.0001 | 83000 | 1.8088 | 0.3697 | 0.1592 |
| 1.828 | 77.0010 | 84000 | 1.7752 | 0.4315 | 0.1585 |
| 1.8672 | 78.0009 | 85000 | 1.7715 | 0.4054 | 0.1684 |
| 1.8834 | 79.0009 | 86000 | 1.7985 | 0.3988 | 0.1603 |
| 1.783 | 80.0008 | 87000 | 1.7518 | 0.4374 | 0.1683 |
| 1.8679 | 81.0007 | 88000 | 1.7966 | 0.3770 | 0.1549 |
| 1.8818 | 82.0006 | 89000 | 1.7799 | 0.4094 | 0.1673 |
| 1.7993 | 83.0006 | 90000 | 1.7827 | 0.3770 | 0.1504 |
| 1.9272 | 84.0005 | 91000 | 1.7251 | 0.4290 | 0.1576 |
| 1.8129 | 85.0004 | 92000 | 1.7738 | 0.3877 | 0.1580 |
| 1.8326 | 86.0004 | 93000 | 1.7855 | 0.4101 | 0.1641 |
| 1.9804 | 87.0003 | 94000 | 1.7172 | 0.4276 | 0.1676 |
| 1.814 | 88.0002 | 95000 | 1.8198 | 0.3801 | 0.1560 |

### Framework versions

- Transformers 4.46.0
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.20.1
model.safetensors
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:c3eca08aad2a87a100f61da69f3590348099f04a78a13eb1845b6596073a2331
 size 162224968
runs/0-sample_rate=0.2/events.out.tfevents.1739260257.yara2.1274861.1
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:823ab67795f19d0ea0d878ca34b386a7d7d7d3ec32bdf845912fb65b85bdb28f
+size 470
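The commit also adds a TensorBoard event file under `runs/`. A minimal sketch for inspecting it locally, assuming the file has been fetched via Git LFS and the `tensorboard` package is installed; the scalar tag name below is an assumption and should be checked against the printed list:

```python
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

# Directory containing the event file added in this commit (fetch it via git-lfs first).
acc = EventAccumulator("runs/0-sample_rate=0.2")
acc.Reload()

print(acc.Tags()["scalars"])            # list the scalar tags actually logged
for event in acc.Scalars("eval/loss"):  # assumed tag; adjust to one printed above
    print(event.step, event.value)
```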