---
license: apache-2.0
---
## Introduction
This repository contains [NTQAI/Nxcode-CQ-7B-orpo](https://huggingface.co/NTQAI/Nxcode-CQ-7B-orpo) quantized to f16, q2, q3, q4, q5, q6, and q8 with llama.cpp.
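The GGUF files follow llama.cpp's usual convert-then-quantize workflow. The sketch below is a minimal illustration of that workflow, not a record of how these exact files were built: it assumes a local llama.cpp checkout, the script and binary names (`convert_hf_to_gguf.py`, `llama-quantize`) vary between llama.cpp versions, and the output filenames and quantization presets (Q2_K, Q4_K_M, ...) are placeholders.

```python
import subprocess

# Assumed paths and names; adjust to your local setup.
LLAMA_CPP = "llama.cpp"                    # llama.cpp repository root
HF_MODEL = "NTQAI/Nxcode-CQ-7B-orpo"       # locally downloaded HF checkpoint
F16_GGUF = "nxcode-cq-7b-orpo-f16.gguf"    # intermediate f16 GGUF (hypothetical name)

# 1. Convert the Hugging Face checkpoint to an f16 GGUF file.
subprocess.run(
    ["python", f"{LLAMA_CPP}/convert_hf_to_gguf.py", HF_MODEL,
     "--outfile", F16_GGUF, "--outtype", "f16"],
    check=True,
)

# 2. Quantize the f16 GGUF into lower-bit variants.
for qtype in ["Q2_K", "Q3_K_M", "Q4_K_M", "Q5_K_M", "Q6_K", "Q8_0"]:
    subprocess.run(
        [f"{LLAMA_CPP}/llama-quantize", F16_GGUF,
         f"nxcode-cq-7b-orpo-{qtype.lower()}.gguf", qtype],
        check=True,
    )
```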
## Prompt Template
```
<|im_start|>system
{system_prompt}
<|im_end|>
<|im_start|>user
{prompt}
<|im_end|>
<|im_start|>assistant
```
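To run one of the quantized files with this template, a minimal sketch using the `llama-cpp-python` bindings is shown below; the GGUF filename, context size, and generation settings are assumptions to adapt as needed.

```python
from llama_cpp import Llama

# Model filename is a placeholder; point it at whichever quantized GGUF you downloaded.
llm = Llama(model_path="nxcode-cq-7b-orpo-q4_k_m.gguf", n_ctx=4096)

system_prompt = "You are a helpful coding assistant."
prompt = "Write a Python function that reverses a string."

# Fill in the ChatML-style prompt template from the section above.
text = (
    f"<|im_start|>system\n{system_prompt}\n<|im_end|>\n"
    f"<|im_start|>user\n{prompt}\n<|im_end|>\n"
    f"<|im_start|>assistant\n"
)

# Generate until the model emits the end-of-turn token.
out = llm(text, max_tokens=512, stop=["<|im_end|>"])
print(out["choices"][0]["text"])
```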
 |