Update tokenizer_config.json #30
opened by ZLHe0
Fix: Add chat_template to tokenizer config
Recent versions of transformers (v4.44+) no longer allow a default chat template. The latest model revision removed this field, causing errors in vLLM and Hugging Face preprocessing:
ValueError: As of transformers v4.44, default chat template is no longer allowed...
Change
Restored the previous template in tokenizer_config.json:
"chat_template": "{% set system_message = 'You are a helpful assistant.' %}{% if messages[0]['role'] == 'system' %}{% set system_message = messages[0]['content'] %}{% endif %}{% if system_message is defined %}{{ '<|im_start|>system\\n' + system_message + '<|im_end|>\\n' }}{% endif %}{% for message in messages %}{% set content = message['content'] %}{% if message['role'] == 'user' %}{{ '<|im_start|>user\\n' + content + '<|im_end|>\\n<|im_start|>assistant\\n' }}{% elif message['role'] == 'assistant' %}{{ content + '<|im_end|>' + '\\n' }}{% endif %}{% endfor %}"
Result
- apply_chat_template works again
- vLLM /chat/completions endpoint runs without error
- Maintains backward compatibility with earlier formatting
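For reference, the restored template expands to the ChatML format shown by this plain-Python mirror of its logic (an illustration of what apply_chat_template now produces, not transformers' actual renderer):

```python
def render_chatml(messages):
    """Plain-Python mirror of the restored Jinja chat_template
    (illustrative; transformers renders the Jinja string itself)."""
    # Default system prompt, overridden by a leading system message.
    system_message = "You are a helpful assistant."
    if messages and messages[0]["role"] == "system":
        system_message = messages[0]["content"]
    prompt = f"<|im_start|>system\n{system_message}<|im_end|>\n"
    for m in messages:
        if m["role"] == "user":
            # Each user turn opens an assistant turn for generation.
            prompt += (
                f"<|im_start|>user\n{m['content']}<|im_end|>\n"
                "<|im_start|>assistant\n"
            )
        elif m["role"] == "assistant":
            prompt += f"{m['content']}<|im_end|>\n"
    return prompt

print(render_chatml([{"role": "user", "content": "Hello"}]))
```

A single user turn therefore yields the default system block followed by the user message and an open assistant header, which is the prompt shape vLLM's /chat/completions endpoint expects.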
Awesome contribution! Thank you for fixing this!
xiezhe24 changed pull request status to merged