w2v-bert-2.0-aphasia_dataset / special_tokens_map.json
{
"bos_token": "<s>",
"eos_token": "</s>",
"pad_token": "[PAD]",
"unk_token": "[UNK]"
}
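
These entries match the defaults used by transformers' Wav2Vec2CTCTokenizer ("<s>"/"</s>" for BOS/EOS), with the padding and unknown tokens set to "[PAD]" and "[UNK]" as is common for CTC fine-tuning. Below is a minimal sketch of how this map is consumed when the tokenizer is loaded; the repository id and the presence of the companion vocab.json and tokenizer_config.json files are assumptions, not confirmed by this file alone.

from transformers import AutoTokenizer

# Assumed repository id; from_pretrained reads special_tokens_map.json
# together with vocab.json and tokenizer_config.json from the same repo.
tokenizer = AutoTokenizer.from_pretrained("jackmildice/w2v-bert-2.0-aphasia_dataset")

# These attributes are populated from the values in special_tokens_map.json.
print(tokenizer.bos_token)  # "<s>"
print(tokenizer.eos_token)  # "</s>"
print(tokenizer.pad_token)  # "[PAD]"
print(tokenizer.unk_token)  # "[UNK]"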