---
language:
- ko
- en
license: mit
tags:
- electra
- korean
---

# Model Card for KcELECTRA: Korean comments ELECTRA

# Model Details

## Model Description

**Updates on 2022.10.08**

- The KcELECTRA-base-v2022 model (formerly v2022-dev) has been renamed.
- Detailed scores for the new model have been added.
- Compared to the previous KcELECTRA-base (v2021), performance improves by about 1%p on most downstream tasks.

---

Most publicly released Korean Transformer models are trained on well-curated data such as Korean Wikipedia, news articles, and books. In practice, however, user-generated noisy-text datasets such as NSMC are not curated: colloquial features and slang abound, and expressions such as typos that never appear in formal writing occur frequently.

KcELECTRA is a pretrained ELECTRA model whose tokenizer and model were trained from scratch on comments and replies collected from Naver News, in order to fit datasets with exactly these characteristics.

Compared to the earlier KcBERT, performance improves considerably through a larger dataset and an expanded vocabulary.

KcELECTRA can be loaded conveniently through Hugging Face's Transformers library (no separate file download is required).
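Loading the model through Transformers can be sketched as follows. This is a hypothetical snippet, not part of the original card: it assumes the Hub id `beomi/KcELECTRA-base` and the generic `AutoTokenizer`/`AutoModel` classes; substitute the repo id you actually use.

```python
from transformers import AutoModel, AutoTokenizer

def load_kcelectra(model_name: str = "beomi/KcELECTRA-base"):
    """Fetch the KcELECTRA tokenizer and encoder from the Hugging Face Hub.

    The Hub id above is an assumption; replace it with the repo you use.
    """
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModel.from_pretrained(model_name)
    return tokenizer, model
```

Calling `load_kcelectra()` downloads the weights on first use and caches them locally.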

- **Developed by:** Junbum Lee
- **Shared by [Optional]:** Hugging Face
- **Model type:** electra
- **Language(s) (NLP):** ko, en
- **License:** MIT
- **Related Models:**
  - **Parent Model:** ELECTRA
- **Resources for more information:**
  - [GitHub Repo](https://github.com/Beomi/KcBERT-finetune)
  - [Model Space](https://huggingface.co/spaces/BeMerciless/korean_malicious_comment)
  - [Blog Post](https://monologg.kr/categories/NLP/ELECTRA/)

# Uses

## Direct Use

This model can be used as a pretrained Korean language model for fine-tuning on downstream NLP tasks, particularly over user-generated noisy text such as comments.

## Downstream Use [Optional]

More information needed

## Out-of-Scope Use

The model should not be used to intentionally create hostile or alienating environments for people.

# Bias, Risks, and Limitations

Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). Predictions generated by the model may include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.

## Recommendations

Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.

# Training Details

## Training Data

The training data consists of all **comments and replies** collected from **comment-heavy news (or all news)** articles published between 2019.01.01 and 2021.03.09. Documents were constructed from comment-reply bundles.

With only the text extracted, the dataset is **about 17.3GB and contains more than 180 million sentences**.

> KcBERT was trained on 2019.01-2020.06 text, about 90 million sentences after cleaning.

#### Finetune Samples

- NSMC with PyTorch-Lightning 1.3.0, GPU, Colab <a href="https://colab.research.google.com/drive/1Hh63kIBAiBw3Hho--BvfdUWLu-ysMFF0?usp=sharing">
<img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/>
</a>

## Training Procedure

### Preprocessing

Preprocessing for PLM training proceeded as follows:

1. Korean and English, special characters, and even emoji (🥳)!

   Regular expressions were used to include Korean, English, special characters, and even emoji in the training data.

   The Korean range was set to `ㄱ-ㅎ가-힣`, which excludes the hanja that fall inside `ㄱ-힣`.

2. Shortening repeated strings in comments

   Runs of repeated characters such as `ㅋㅋㅋㅋㅋ` were merged into `ㅋㅋ`.

3. Cased model

   For English, KcBERT is a cased model that preserves upper and lower case.

4. Removing texts under 10 characters

   Texts shorter than 10 characters were excluded, since they usually consist of a single word.

5. Deduplication

   To remove repeatedly posted comments, exactly matching duplicates were merged into one.

6. Removing `OOO`

   In Naver comments, profanity is masked as `OOO` by the platform's own filtering; these tokens were replaced with spaces.
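Steps 2, 4, 5 and 6 above can be sketched with the standard library alone. This is an illustrative simplification, not the exact released pipeline (the real cleanup uses soynlp's `repeat_normalize` and a regex allow-list):

```python
import re

def collapse_repeats(text: str, num_repeats: int = 2) -> str:
    # Step 2: collapse character runs longer than num_repeats ("zzzzz" -> "zz"),
    # similar in spirit to soynlp's repeat_normalize.
    return re.sub(r'(.)\1{%d,}' % num_repeats, r'\1' * num_repeats, text)

def preprocess(comments):
    seen, out = set(), []
    for c in comments:
        c = c.replace('OOO', ' ')        # step 6: strip masked profanity
        c = collapse_repeats(c).strip()  # step 2: shorten repeated strings
        if len(c) < 10:                  # step 4: drop very short texts
            continue
        if c in seen:                    # step 5: exact-duplicate removal
            continue
        seen.add(c)
        out.append(c)
    return out
```

For example, `preprocess(["hi", "what a great movie!!!!!", "what a great movie!!!!!"])` keeps a single `"what a great movie!!"`.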

### Speeds, Sizes, Times

More information needed

# Evaluation

## Testing Data, Factors & Metrics

### Testing Data

#### Cleaned Data

- The additional data beyond KcBERT will be released after cleanup.

### Factors

More information needed

### Metrics

More information needed

## Results

(Performance was evaluated with checkpoints saved every 100k steps; see the `KcBERT-finetune` repo for details.)

The training loss drops sharply within the first 100k-200k steps and then keeps decreasing steadily until the end of training.

![KcELECTRA-base Pretrain Loss](https://cdn.jsdelivr.net/gh/beomi/blog-img@master/2021/04/07/image-20210407201231133.png)

### Downstream-task performance by KcELECTRA pretrain step

> 💡 The table below shows results tested on only a subset of checkpoints, not all of them.

![Downstream-task performance by KcELECTRA pretrain step](https://cdn.jsdelivr.net/gh/beomi/blog-img@master/2021/04/07/image-20210407215557039.png)

- As shown above, KcELECTRA-base achieves higher performance than KcBERT-base and KcBERT-large **on every dataset**.
- KcELECTRA pretraining also shows gradual performance gains as the train step increases.

\***These results were obtained with the config settings as-is; additional hyperparameter tuning may yield better scores.**

| | Size | **NSMC**<br/>(acc) | **Naver NER**<br/>(F1) | **PAWS**<br/>(acc) | **KorNLI**<br/>(acc) | **KorSTS**<br/>(spearman) | **Question Pair**<br/>(acc) | **KorQuaD (Dev)**<br/>(EM/F1) |
| :----------------- | :-------------: | :----------------: | :--------------------: | :----------------: | :------------------: | :-----------------------: | :-------------------------: | :---------------------------: |
| **KcELECTRA-base-v2022** | 475M | **91.97** | 87.35 | 76.50 | 82.12 | 83.67 | 95.12 | 69.00 / 90.40 |
| **KcELECTRA-base** | 475M | 91.71 | 86.90 | 74.80 | 81.65 | 82.65 | **95.78** | 70.60 / 90.11 |
| KcBERT-Base | 417M | 89.62 | 84.34 | 66.95 | 74.85 | 75.57 | 93.93 | 60.25 / 84.39 |
| KcBERT-Large | 1.2G | 90.68 | 85.53 | 70.15 | 76.99 | 77.49 | 94.06 | 62.16 / 86.64 |
| KoBERT | 351M | 89.63 | 86.11 | 80.65 | 79.00 | 79.64 | 93.93 | 52.81 / 80.27 |
| XLM-Roberta-Base | 1.03G | 89.49 | 86.26 | 82.95 | 79.92 | 79.09 | 93.53 | 64.70 / 88.94 |
| HanBERT | 614M | 90.16 | 87.31 | 82.40 | 80.89 | 83.33 | 94.19 | 78.74 / 92.02 |
| KoELECTRA-Base | 423M | 90.21 | 86.87 | 81.90 | 80.85 | 83.21 | 94.20 | 61.10 / 89.59 |
| KoELECTRA-Base-v2 | 423M | 89.70 | 87.02 | 83.90 | 80.61 | 84.30 | 94.72 | 84.34 / 92.58 |
| KoELECTRA-Base-v3 | 423M | 90.63 | **88.11** | **84.45** | **82.24** | **85.53** | 95.25 | **84.83 / 93.45** |
| DistilKoBERT | 108M | 88.41 | 84.13 | 62.55 | 70.55 | 73.21 | 92.48 | 54.12 / 77.80 |
# Model Examination

More information needed

# Environmental Impact

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** TPU `v3-8`
- **Hours used:** 240 (10 days)
- **Cloud Provider:** GCP (TPUs provided through the TFRC program)
- **Compute Region:** More information needed
- **Carbon Emitted:** More information needed

# Technical Specifications [optional]

## Model Architecture and Objective

More information needed

## Compute Infrastructure

More information needed

### Hardware

Training ran for about 10 days on a TPU `v3-8`. The model currently released on Huggingface carries the weights of the 848k-step checkpoint.

### Software

- `pytorch ~= 1.8.0`
- `transformers ~= 4.11.3`
- `emoji ~= 0.6.0`
- `soynlp ~= 0.0.493`

# Citation

**BibTeX:**

```
@misc{lee2021kcelectra,
  author = {Junbum Lee},
  title = {KcELECTRA: Korean comments ELECTRA},
  year = {2021},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/Beomi/KcELECTRA}}
}
```

For uses other than citation in papers, please credit the MIT license. ☺️

# Glossary [optional]

More information needed

# More Information [optional]

```
💡 NOTE 💡
KoELECTRA, trained on a general corpus, is likely to score better on general tasks.
KcBERT/KcELECTRA are PLMs that work better on user-generated, noisy text.
```

## Acknowledgement

The GCP/TPU environment in which the KcELECTRA model was trained was supported by the [TFRC](https://www.tensorflow.org/tfrc?hl=ko) program.

Thanks to [Monologg](https://github.com/monologg/), who gave much advice during model training :)

## Reference

### Github Repos

- [KcBERT by Beomi](https://github.com/Beomi/KcBERT)
- [BERT by Google](https://github.com/google-research/bert)
- [KoBERT by SKT](https://github.com/SKTBrain/KoBERT)
- [Transformers by Huggingface](https://github.com/huggingface/transformers)
- [Tokenizers by Huggingface](https://github.com/huggingface/tokenizers)
- [ELECTRA train code by KLUE](https://github.com/KLUE-benchmark/KLUE-ELECTRA)

# Model Card Authors [optional]

Junbum Lee in collaboration with Ezi Ozoani and the Hugging Face team

# Model Card Contact

More information needed

# How to Get Started with the Model

Use the code below to get started with the model.

<details>
<summary> Click to expand </summary>

```bash
pip install soynlp emoji
```

Please apply the `clean` function below to your text data.

```python
import re
import emoji
from soynlp.normalizer import repeat_normalize

pattern = re.compile('[^ .,?!/@$%~％·∼()\x00-\x7Fㄱ-ㅣ가-힣]+')
url_pattern = re.compile(
    r'https?:\/\/(www\.)?[-a-zA-Z0-9@:%._\+~#=]{1,256}\.[a-zA-Z0-9()]{1,6}\b([-a-zA-Z0-9()@:%_\+.~#?&//=]*)')

def clean(x):
    x = pattern.sub(' ', x)                 # drop characters outside the allow-list
    x = emoji.replace_emoji(x, replace='')  # remove emoji
    x = url_pattern.sub('', x)              # remove URLs
    x = x.strip()
    x = repeat_normalize(x, num_repeats=2)  # collapse repeated characters
    return x
```

</details>
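If `emoji` and `soynlp` are not available, the character allow-list part of `clean` can be approximated with the standard library alone. This is a simplified sketch: unlike the full `clean`, it does not strip URLs, emoji, or repeated characters.

```python
import re

# Same allow-list as `pattern` above: a few punctuation marks, ASCII
# (\x00-\x7F), Hangul compatibility jamo (ㄱ-ㅣ) and syllables (가-힣).
allowed = re.compile('[^ .,?!/@$%~％·∼()\x00-\x7Fㄱ-ㅣ가-힣]+')

def keep_allowed(text: str) -> str:
    # Replace every disallowed run with a single space, then trim the ends.
    return allowed.sub(' ', text).strip()
```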