Prompt format

#1
by usamacogniz - opened

What is the prompt format for this model? Does it follow chatml or something else?

You can find the default chat template associated with the model in the tokenizer config file (tokenizer_config.json):
"chat_template": "{% for message in messages %}{% if message['from'] == 'human' %}{{'<|im_start|>user\n' + message['value'] + '<|im_end|>\n'}}{% elif message['from'] == 'gpt' %}{{'<|im_start|>assistant\n' + message['value'] + '<|im_end|>\n' }}{% else %}{{ '<|im_start|>system\n' + message['value'] + '<|im_end|>\n' }}{% endif %}{% endfor %}{% if add_generation_prompt %}{{ '<|im_start|>assistant\n' }}{% endif %}"

BarraHome changed discussion status to closed