idah4 committed on
Commit cf0568d · verified · 1 Parent(s): d806837

Model save
README.md ADDED
@@ -0,0 +1,175 @@
+ ---
+ license: mit
+ base_model: hyunwoongko/kobart
+ tags:
+ - generated_from_trainer
+ model-index:
+ - name: qa_kor_market
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # qa_kor_market
+
+ This model is a fine-tuned version of [hyunwoongko/kobart](https://huggingface.co/hyunwoongko/kobart) on an unknown dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.7618
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 1e-05
+ - train_batch_size: 16
+ - eval_batch_size: 16
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - lr_scheduler_warmup_steps: 400
+ - num_epochs: 3
+
+ ### Training results
+
+ | Training Loss | Epoch | Step  | Validation Loss |
+ |:-------------:|:-----:|:-----:|:---------------:|
+ | No log        | 0.03  | 100   | 3.4839          |
+ | No log        | 0.05  | 200   | 1.4909          |
+ | No log        | 0.08  | 300   | 1.2606          |
+ | No log        | 0.1   | 400   | 1.1675          |
+ | 3.0259        | 0.13  | 500   | 1.1008          |
+ | 3.0259        | 0.15  | 600   | 1.0580          |
+ | 3.0259        | 0.18  | 700   | 1.0222          |
+ | 3.0259        | 0.2   | 800   | 0.9938          |
+ | 3.0259        | 0.23  | 900   | 0.9707          |
+ | 1.0853        | 0.25  | 1000  | 0.9571          |
+ | 1.0853        | 0.28  | 1100  | 0.9370          |
+ | 1.0853        | 0.3   | 1200  | 0.9293          |
+ | 1.0853        | 0.33  | 1300  | 0.9146          |
+ | 1.0853        | 0.35  | 1400  | 0.9065          |
+ | 0.992         | 0.38  | 1500  | 0.8997          |
+ | 0.992         | 0.41  | 1600  | 0.8930          |
+ | 0.992         | 0.43  | 1700  | 0.8834          |
+ | 0.992         | 0.46  | 1800  | 0.8788          |
+ | 0.992         | 0.48  | 1900  | 0.8714          |
+ | 0.9418        | 0.51  | 2000  | 0.8706          |
+ | 0.9418        | 0.53  | 2100  | 0.8676          |
+ | 0.9418        | 0.56  | 2200  | 0.8619          |
+ | 0.9418        | 0.58  | 2300  | 0.8548          |
+ | 0.9418        | 0.61  | 2400  | 0.8514          |
+ | 0.9222        | 0.63  | 2500  | 0.8511          |
+ | 0.9222        | 0.66  | 2600  | 0.8483          |
+ | 0.9222        | 0.68  | 2700  | 0.8425          |
+ | 0.9222        | 0.71  | 2800  | 0.8396          |
+ | 0.9222        | 0.74  | 2900  | 0.8384          |
+ | 0.8981        | 0.76  | 3000  | 0.8360          |
+ | 0.8981        | 0.79  | 3100  | 0.8295          |
+ | 0.8981        | 0.81  | 3200  | 0.8290          |
+ | 0.8981        | 0.84  | 3300  | 0.8273          |
+ | 0.8981        | 0.86  | 3400  | 0.8221          |
+ | 0.874         | 0.89  | 3500  | 0.8228          |
+ | 0.874         | 0.91  | 3600  | 0.8213          |
+ | 0.874         | 0.94  | 3700  | 0.8183          |
+ | 0.874         | 0.96  | 3800  | 0.8163          |
+ | 0.874         | 0.99  | 3900  | 0.8178          |
+ | 0.8575        | 1.01  | 4000  | 0.8143          |
+ | 0.8575        | 1.04  | 4100  | 0.8118          |
+ | 0.8575        | 1.06  | 4200  | 0.8094          |
+ | 0.8575        | 1.09  | 4300  | 0.8092          |
+ | 0.8575        | 1.12  | 4400  | 0.8085          |
+ | 0.8374        | 1.14  | 4500  | 0.8048          |
+ | 0.8374        | 1.17  | 4600  | 0.8041          |
+ | 0.8374        | 1.19  | 4700  | 0.8018          |
+ | 0.8374        | 1.22  | 4800  | 0.8007          |
+ | 0.8374        | 1.24  | 4900  | 0.7988          |
+ | 0.8282        | 1.27  | 5000  | 0.7980          |
+ | 0.8282        | 1.29  | 5100  | 0.7968          |
+ | 0.8282        | 1.32  | 5200  | 0.7974          |
+ | 0.8282        | 1.34  | 5300  | 0.7949          |
+ | 0.8282        | 1.37  | 5400  | 0.7919          |
+ | 0.8149        | 1.39  | 5500  | 0.7931          |
+ | 0.8149        | 1.42  | 5600  | 0.7900          |
+ | 0.8149        | 1.44  | 5700  | 0.7887          |
+ | 0.8149        | 1.47  | 5800  | 0.7875          |
+ | 0.8149        | 1.5   | 5900  | 0.7883          |
+ | 0.8098        | 1.52  | 6000  | 0.7886          |
+ | 0.8098        | 1.55  | 6100  | 0.7860          |
+ | 0.8098        | 1.57  | 6200  | 0.7873          |
+ | 0.8098        | 1.6   | 6300  | 0.7822          |
+ | 0.8098        | 1.62  | 6400  | 0.7841          |
+ | 0.8306        | 1.65  | 6500  | 0.7828          |
+ | 0.8306        | 1.67  | 6600  | 0.7817          |
+ | 0.8306        | 1.7   | 6700  | 0.7812          |
+ | 0.8306        | 1.72  | 6800  | 0.7814          |
+ | 0.8306        | 1.75  | 6900  | 0.7799          |
+ | 0.7974        | 1.77  | 7000  | 0.7774          |
+ | 0.7974        | 1.8   | 7100  | 0.7795          |
+ | 0.7974        | 1.83  | 7200  | 0.7782          |
+ | 0.7974        | 1.85  | 7300  | 0.7786          |
+ | 0.7974        | 1.88  | 7400  | 0.7773          |
+ | 0.7945        | 1.9   | 7500  | 0.7749          |
+ | 0.7945        | 1.93  | 7600  | 0.7737          |
+ | 0.7945        | 1.95  | 7700  | 0.7743          |
+ | 0.7945        | 1.98  | 7800  | 0.7742          |
+ | 0.7945        | 2.0   | 7900  | 0.7732          |
+ | 0.8005        | 2.03  | 8000  | 0.7758          |
+ | 0.8005        | 2.05  | 8100  | 0.7726          |
+ | 0.8005        | 2.08  | 8200  | 0.7716          |
+ | 0.8005        | 2.1   | 8300  | 0.7742          |
+ | 0.8005        | 2.13  | 8400  | 0.7720          |
+ | 0.7788        | 2.15  | 8500  | 0.7706          |
+ | 0.7788        | 2.18  | 8600  | 0.7701          |
+ | 0.7788        | 2.21  | 8700  | 0.7702          |
+ | 0.7788        | 2.23  | 8800  | 0.7676          |
+ | 0.7788        | 2.26  | 8900  | 0.7699          |
+ | 0.7685        | 2.28  | 9000  | 0.7689          |
+ | 0.7685        | 2.31  | 9100  | 0.7677          |
+ | 0.7685        | 2.33  | 9200  | 0.7686          |
+ | 0.7685        | 2.36  | 9300  | 0.7671          |
+ | 0.7685        | 2.38  | 9400  | 0.7668          |
+ | 0.7814        | 2.41  | 9500  | 0.7670          |
+ | 0.7814        | 2.43  | 9600  | 0.7669          |
+ | 0.7814        | 2.46  | 9700  | 0.7661          |
+ | 0.7814        | 2.48  | 9800  | 0.7653          |
+ | 0.7814        | 2.51  | 9900  | 0.7663          |
+ | 0.7824        | 2.53  | 10000 | 0.7655          |
+ | 0.7824        | 2.56  | 10100 | 0.7654          |
+ | 0.7824        | 2.59  | 10200 | 0.7653          |
+ | 0.7824        | 2.61  | 10300 | 0.7652          |
+ | 0.7824        | 2.64  | 10400 | 0.7640          |
+ | 0.7798        | 2.66  | 10500 | 0.7647          |
+ | 0.7798        | 2.69  | 10600 | 0.7637          |
+ | 0.7798        | 2.71  | 10700 | 0.7636          |
+ | 0.7798        | 2.74  | 10800 | 0.7629          |
+ | 0.7798        | 2.76  | 10900 | 0.7629          |
+ | 0.7619        | 2.79  | 11000 | 0.7629          |
+ | 0.7619        | 2.81  | 11100 | 0.7624          |
+ | 0.7619        | 2.84  | 11200 | 0.7621          |
+ | 0.7619        | 2.86  | 11300 | 0.7621          |
+ | 0.7619        | 2.89  | 11400 | 0.7623          |
+ | 0.7723        | 2.92  | 11500 | 0.7621          |
+ | 0.7723        | 2.94  | 11600 | 0.7619          |
+ | 0.7723        | 2.97  | 11700 | 0.7619          |
+ | 0.7723        | 2.99  | 11800 | 0.7618          |
+
+
+ ### Framework versions
+
+ - Transformers 4.38.2
+ - Pytorch 2.2.1+cu121
+ - Datasets 2.18.0
+ - Tokenizers 0.15.2
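The hyperparameters above (learning_rate 1e-05, linear scheduler, 400 warmup steps) imply a simple per-step learning-rate rule. A minimal sketch, assuming the schedule matches the usual linear warmup-then-decay behavior; `total_steps` (~11850) is an assumption inferred from the results table (epoch 2.99 at step 11800), not a figure reported in the card:

```python
# Hypothetical sketch of the linear warmup + linear decay schedule described above.
# total_steps is inferred from the results table, not stated in the model card.
def lr_at(step: int, base_lr: float = 1e-5,
          warmup_steps: int = 400, total_steps: int = 11850) -> float:
    """Learning rate at a given optimizer step."""
    if step < warmup_steps:
        # Linear warmup from 0 up to base_lr over the first warmup_steps steps.
        return base_lr * step / warmup_steps
    # Linear decay from base_lr down to 0 at total_steps.
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(lr_at(200))    # halfway through warmup: 5e-06
print(lr_at(400))    # peak, equal to base_lr: 1e-05
print(lr_at(11850))  # end of training: 0.0
```

Under these assumptions the rate ramps up during the first 400 steps and then falls linearly to zero, which is consistent with the `linear` scheduler type listed above.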
generation_config.json ADDED
@@ -0,0 +1,9 @@
+ {
+   "_from_model_config": true,
+   "bos_token_id": 1,
+   "decoder_start_token_id": 1,
+   "eos_token_id": 1,
+   "forced_eos_token_id": 1,
+   "pad_token_id": 3,
+   "transformers_version": "4.38.2"
+ }
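The token-id mapping above (BOS, decoder start, EOS, and forced EOS all id 1; padding id 3) can be sanity-checked by parsing the file with the standard library; a minimal sketch with the config inlined for illustration:

```python
import json

# The generation config shown above, inlined as a string for illustration.
config_text = """
{
  "_from_model_config": true,
  "bos_token_id": 1,
  "decoder_start_token_id": 1,
  "eos_token_id": 1,
  "forced_eos_token_id": 1,
  "pad_token_id": 3,
  "transformers_version": "4.38.2"
}
"""
cfg = json.loads(config_text)

# BOS, decoder start, EOS, and forced EOS all share token id 1 in this config.
special_ids = {cfg[k] for k in ("bos_token_id", "decoder_start_token_id",
                                "eos_token_id", "forced_eos_token_id")}
assert special_ids == {1}

print(cfg["pad_token_id"])  # 3
```

Sharing one id for BOS and EOS follows the base KoBART tokenizer's convention, so generation both starts and stops on the same special token.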
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:30920bdf01e4b81a902454001d53e519a143ad053f194b72d05afd25e14318d6
+ oid sha256:f6b94846673fba7b88f462fd554eba0d25d9a4cc105c1df63032a5adc6459f76
  size 495589768
runs/Apr16_00-01-30_a5ac0e95e35a/events.out.tfevents.1713225695.a5ac0e95e35a.353.0 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:0f8da8744ca0eb27d10152452adc22dbba74d86a83eaf1343fc6a95ee8dedc22
- size 40476
+ oid sha256:71b474299f065ed780cd59363c3969e55dc1180f1aa65a14d78f25cd3cb5df82
+ size 42667