Safetensors
mpt
Krutrim
language-model
custom_code
Navanit-AI committed (verified)
Commit f815f04 · 1 parent: ef1e553

Chat template should be inside the tokenizer json


Hi @krutrim-admin,

I have made a change: I added the chat template to the tokenizer config.
If we do this, we don't have to define the template in the inference code.
Kindly review this from your side; I will update the README file too if this works okay.

Files changed (1):
  1. tokenizer_config.json (+1, −0)
tokenizer_config.json CHANGED

@@ -1753,5 +1753,6 @@
   "model_max_length": 4096,
   "pad_token": "<pad>",
   "tokenizer_class": "PreTrainedTokenizerFast",
+  "chat_template": "{% for message in messages %}{% if message['role'] == 'system' %}{{ '<|SYSTEM|> ' + message['content'] + '\n' }}{% elif message['role'] == 'user' %}{{ '<|USER|> ' + message['content'] + '\n' }}{% elif message['role'] == 'assistant' %}{% if not loop.last %}{{ '<|RESPONSE|>\n' + message['content'] + eos_token + '\n' }}{% else %}{{ '<|RESPONSE|>\n' + message['content'] + eos_token }}{% endif %}{% endif %}{% if loop.last and add_generation_prompt %}{{ '<|RESPONSE|>\n' }}{% endif %}{% endfor %}",
   "unk_token": "<unk>"
 }
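For reference, the added template can be previewed outside `transformers` with plain Jinja2 (which is what `apply_chat_template` uses under the hood). A minimal sketch — the `eos_token` value `"</s>"` and the sample messages below are illustrative assumptions, not taken from the actual Krutrim tokenizer:

```python
# Render the proposed chat template with plain Jinja2 to preview the prompt
# it produces. NOTE: eos_token="</s>" is an assumed value for illustration.
from jinja2 import Template

chat_template = (
    "{% for message in messages %}"
    "{% if message['role'] == 'system' %}{{ '<|SYSTEM|> ' + message['content'] + '\n' }}"
    "{% elif message['role'] == 'user' %}{{ '<|USER|> ' + message['content'] + '\n' }}"
    "{% elif message['role'] == 'assistant' %}"
    "{% if not loop.last %}{{ '<|RESPONSE|>\n' + message['content'] + eos_token + '\n' }}"
    "{% else %}{{ '<|RESPONSE|>\n' + message['content'] + eos_token }}{% endif %}"
    "{% endif %}"
    "{% if loop.last and add_generation_prompt %}{{ '<|RESPONSE|>\n' }}{% endif %}"
    "{% endfor %}"
)

messages = [
    {"role": "system", "content": "You are helpful."},
    {"role": "user", "content": "Hello"},
    {"role": "assistant", "content": "Hi!"},
    {"role": "user", "content": "How are you?"},
]

# Closed assistant turns get eos_token; the trailing <|RESPONSE|> header cues
# the model to generate the next reply.
prompt = Template(chat_template).render(
    messages=messages, eos_token="</s>", add_generation_prompt=True
)
print(prompt)
```

Once this lands, `tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)` should produce the same string with no template passed in the inference code.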