Add pipeline tag

#1
by nielsr (HF staff) - opened
Files changed (1)
  1. README.md +2 -1
README.md CHANGED
@@ -7,6 +7,7 @@ license: llama3.1
  tags:
  - mergekit
  - merge
+ pipeline_tag: text-generation
  ---
 
  **Typhoon2-DeepSeek-R1-70B**: Thai reasoning large language model. (research preview)
@@ -121,7 +122,7 @@ vllm serve scb10x/llama3.1-typhoon2-deepseek-r1-70b-preview --tensor-parallel-si
  ## **Chat template**
  This model is using the DeepSeek R1 70B Distill chat template, not the Llama 3 template. Be careful.
  ```
- {% if not add_generation_prompt is defined %}{% set add_generation_prompt = false %}{% endif %}{% set ns = namespace(is_first=false, is_tool=false, is_output_first=true, system_prompt='') %}{%- for message in messages %}{%- if message['role'] == 'system' %}{% set ns.system_prompt = message['content'] %}{%- endif %}{%- endfor %}{{bos_token}}{{ns.system_prompt}}{%- for message in messages %}{%- if message['role'] == 'user' %}{%- set ns.is_tool = false -%}{{'<|User|>' + message['content']}}{%- endif %}{%- if message['role'] == 'assistant' and message['content'] is none %}{%- set ns.is_tool = false -%}{%- for tool in message['tool_calls']%}{%- if not ns.is_first %}{{'<|Assistant|><|toolcallsbegin|><|toolcallbegin|>' + tool['type'] + '<|toolsep|>' + tool['function']['name'] + '\\n' + '```json' + '\\n' + tool['function']['arguments'] + '\\n' + '```' + '<|toolcallend|>'}}{%- set ns.is_first = true -%}{%- else %}{{'\\n' + '<|toolcallbegin|>' + tool['type'] + '<|toolsep|>' + tool['function']['name'] + '\\n' + '```json' + '\\n' + tool['function']['arguments'] + '\\n' + '```' + '<|toolcallend|>'}}{{'<|toolcallsend|><|endofsentence|>'}}{%- endif %}{%- endfor %}{%- endif %}{%- if message['role'] == 'assistant' and message['content'] is not none %}{%- if ns.is_tool %}{{'<|tooloutputsend|>' + message['content'] + '<|endofsentence|>'}}{%- set ns.is_tool = false -%}{%- else %}{% set content = message['content'] %}{% if '</think>' in content %}{% set content = content.split('</think>')[-1] %}{% endif %}{{'<|Assistant|>' + content + '<|endofsentence|>'}}{%- endif %}{%- endif %}{%- if message['role'] == 'tool' %}{%- set ns.is_tool = true -%}{%- if ns.is_output_first %}{{'<|tooloutputsbegin|><|tooloutputbegin|>' + message['content'] + '<|tooloutputend|>'}}{%- set ns.is_output_first = false %}{%- else %}{{'\\n<|tooloutputbegin|>' + message['content'] + '<|tooloutputend|>'}}{%- endif %}{%- endif %}{%- endfor -%}{% if ns.is_tool %}{{'<|tooloutputsend|>'}}{% endif %}{% if add_generation_prompt and not ns.is_tool %}{{'<|Assistant|><think>\\n'}}{% endif %}
+ {% if not add_generation_prompt is defined %}{% set add_generation_prompt = false %}{% endif %}{% set ns = namespace(is_first=false, is_tool=false, is_output_first=true, system_prompt='') %}{%- for message in messages %}{%- if message['role'] == 'system' %}{% set ns.system_prompt = message['content'] %}{%- endif %}{%- endfor %}{{bos_token}}{{ns.system_prompt}}{%- for message in messages %}{%- if message['role'] == 'user' %}{%- set ns.is_tool = false -%}{{'<|User|>' + message['content']}}{%- endif %}{%- if message['role'] == 'assistant' and message['content'] is none %}{%- set ns.is_tool = false -%}{%- for tool in message['tool_calls']%}{%- if not ns.is_first %}{{'<|Assistant|><|tool calls begin|><|tool call begin|>' + tool['type'] + '<|tool sep|>' + tool['function']['name'] + '\\n' + '```json' + '\\n' + tool['function']['arguments'] + '\\n' + '```' + '<|tool call end|>'}}{%- set ns.is_first = true -%}{%- else %}{{'\\n' + '<|tool call begin|>' + tool['type'] + '<|tool sep|>' + tool['function']['name'] + '\\n' + '```json' + '\\n' + tool['function']['arguments'] + '\\n' + '```' + '<|tool call end|>'}}{{'<|tool calls end|><|end of sentence|>'}}{%- endif %}{%- endfor %}{%- endif %}{%- if message['role'] == 'assistant' and message['content'] is not none %}{%- if ns.is_tool %}{{'<|tool outputs end|>' + message['content'] + '<|end of sentence|>'}}{%- set ns.is_tool = false -%}{%- else %}{% set content = message['content'] %}{% if '</think>' in content %}{% set content = content.split('</think>')[-1] %}{% endif %}{{'<|Assistant|>' + content + '<|end of sentence|>'}}{%- endif %}{%- endif %}{%- if message['role'] == 'tool' %}{%- set ns.is_tool = true -%}{%- if ns.is_output_first %}{{'<|tool outputs begin|><|tool output begin|>' + message['content'] + '<|tool output end|>'}}{%- set ns.is_output_first = false %}{%- else %}{{'\\n<|tool output begin|>' + message['content'] + '<|tool output end|>'}}{%- endif %}{%- endif %}{%- endfor -%}{% if ns.is_tool %}{{'<|tool outputs end|>'}}{% endif %}{% if add_generation_prompt and not ns.is_tool %}{{'<|Assistant|><think>\\n'}}{% endif %}
  ```
 
  ## **Tool use**
 
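For context on the two hunks: `pipeline_tag: text-generation` is the model-card metadata field the Hub uses to list the repository under the text-generation task, and the second hunk touches the README's quoted DeepSeek R1 Distill chat template. Below is a minimal sketch, not part of this PR, of rendering a prompt with the chat template that ships in the repo's tokenizer; the message content is a placeholder.

```python
# Sketch only: assumes the repo's tokenizer carries the chat template quoted in the diff.
from transformers import AutoTokenizer

model_id = "scb10x/llama3.1-typhoon2-deepseek-r1-70b-preview"  # model ID from the hunk header
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Placeholder conversation; real content would come from the application.
messages = [
    {"role": "user", "content": "Summarize what this model merge does."},
]

# apply_chat_template renders the tokenizer's built-in Jinja template.
# With add_generation_prompt=True the prompt ends with the assistant prefix
# (the '<think>' opener shown at the end of the quoted template).
prompt = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
)
print(prompt)
```

When the model is served through vLLM's chat endpoint (as in the command referenced by the second hunk header), the same template is applied server-side, which is why the README warns against assuming the Llama 3 format.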