KoichiYasuoka committed
Commit 9ff1811 · 1 Parent(s): 4b9af83
Files changed (2)
  1. README.md +1 -1
  2. maker.py +1 -1
README.md CHANGED
@@ -18,7 +18,7 @@ widget:
 
 ## Model Description
 
-This is a GPT-2 model pre-trained for POS-tagging and dependency-parsing, derived from [tha_thai_1000mb](https://huggingface.co/goldfish-models/tha_thai_1000mb)refined for [Thai Universal Dependency Treebank](https://github.com/nlp-chula/TUD).
+This is a GPT-2 model pre-trained for POS-tagging and dependency-parsing, derived from [tha_thai_1000mb](https://huggingface.co/goldfish-models/tha_thai_1000mb) refined for [Thai Universal Dependency Treebank](https://github.com/nlp-chula/TUD).
 
 ## How to Use
 
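The README's "How to Use" section lies outside this hunk. For context, a minimal sketch of tagging Thai text with the finished model, assuming the checkpoint at KoichiYasuoka/goldfish-gpt2-thai-ud-causal loads as a standard token-classification model; the pipeline call, example sentence, and aggregation setting are assumptions, not the README's actual snippet:

```python
# Minimal POS-tagging sketch (assumed usage, not the README's actual "How to Use" code)
from transformers import pipeline

nlp = pipeline("token-classification",
               model="KoichiYasuoka/goldfish-gpt2-thai-ud-causal",
               aggregation_strategy="simple")  # merge subword pieces into whole tokens
print(nlp("หลายหัวดีกว่าหัวเดียว"))  # example Thai sentence; each entry carries a tag and a character span
```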
maker.py CHANGED
@@ -4,7 +4,7 @@ tgt="KoichiYasuoka/goldfish-gpt2-thai-ud-causal"
 url="https://github.com/KoichiYasuoka/spaCy-Thai"
 
 import os,json,re
-from transformers import AutoTokenizer,PreTrainedTokenizerFast
+from transformers import AutoTokenizer,PreTrainedTokenizerFast,AutoConfig,GPT2ForTokenClassification,DefaultDataCollator,TrainingArguments,Trainer
 from tokenizers import pre_tokenizers,decoders
 d=os.path.join(os.path.basename(url),"UD_Thai-Corpora")
 os.system("test -d "+d+" || git clone --depth=1 "+url)
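The widened import line indicates that later, unshown parts of maker.py fine-tune a GPT2ForTokenClassification head with the Trainer API. A minimal sketch of how those classes typically fit together; the label set, hyperparameters, and toy training example are placeholder assumptions, while the base and target model names come from the README and the hunk header:

```python
# Hypothetical wiring of the newly imported classes; maker.py's real dataset
# construction and label inventory are not visible in this hunk.
from transformers import (AutoTokenizer, AutoConfig, GPT2ForTokenClassification,
                          DefaultDataCollator, TrainingArguments, Trainer)

src = "goldfish-models/tha_thai_1000mb"             # base model named in the README
tgt = "KoichiYasuoka/goldfish-gpt2-thai-ud-causal"  # target repo from the hunk header
labels = ["NOUN", "VERB", "ADP"]                    # placeholder label set, not the real UPOS/deprel tags

tkz = AutoTokenizer.from_pretrained(src)
cfg = AutoConfig.from_pretrained(src, num_labels=len(labels),
                                 id2label=dict(enumerate(labels)),
                                 label2id={l: i for i, l in enumerate(labels)})
mdl = GPT2ForTokenClassification.from_pretrained(src, config=cfg)

# One toy training example: token ids with one (dummy) label per token.
enc = tkz("หลายหัวดีกว่าหัวเดียว", truncation=True)
train_dataset = [{"input_ids": enc["input_ids"],
                  "attention_mask": enc["attention_mask"],
                  "labels": [0] * len(enc["input_ids"])}]

args = TrainingArguments(output_dir=tgt, per_device_train_batch_size=16,
                         num_train_epochs=3, save_strategy="no")
trainer = Trainer(model=mdl, args=args,
                  data_collator=DefaultDataCollator(),  # stacks the pre-tokenized examples into tensors
                  train_dataset=train_dataset)
trainer.train()
mdl.save_pretrained(tgt)
tkz.save_pretrained(tgt)
```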