ltgbert-qa / tokenizer_config.json
Uploaded by amroadel1 in commit e2aa752 (verified): "Upload 8 files"
{
"model_max_length": 128,
"name_or_path": "/content/ltg-bert-bnc",
"special_tokens_map_file": "/content/ltg-bert-bnc/special_tokens_map.json",
"tokenizer_class": "PreTrainedTokenizerFast"
}
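For context, these fields follow the Hugging Face `transformers` tokenizer-config conventions: `model_max_length` caps input length at 128 tokens, and `tokenizer_class` tells `AutoTokenizer` which class to instantiate. A minimal sketch of reading the file (the field interpretations in the comments reflect typical `transformers` behavior, not anything stated in this repo):

```python
import json

# The tokenizer_config.json contents reproduced verbatim from above.
TOKENIZER_CONFIG = """
{
  "model_max_length": 128,
  "name_or_path": "/content/ltg-bert-bnc",
  "special_tokens_map_file": "/content/ltg-bert-bnc/special_tokens_map.json",
  "tokenizer_class": "PreTrainedTokenizerFast"
}
"""

config = json.loads(TOKENIZER_CONFIG)

# model_max_length: sequences longer than this are cut off when
# tokenizing with truncation=True.
print(config["model_max_length"])   # 128

# tokenizer_class: the class transformers instantiates when the
# tokenizer is loaded, e.g. via AutoTokenizer.from_pretrained(...).
print(config["tokenizer_class"])    # PreTrainedTokenizerFast
```

Note that `name_or_path` and `special_tokens_map_file` record local paths from the machine the tokenizer was saved on (`/content/...` suggests a Colab session); they are informational and are re-resolved when the tokenizer is loaded from the hub.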