---
license: other
license_name: yi-license
license_link: https://huggingface.co/01-ai/Yi-34B-200K/blob/main/LICENSE
tags:
- yi
- moe
---
![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/6468ce47e134d050a58aa89c/jVCgVixLmOsAofXVUUgkg.jpeg)
# Cosmosis-3x34B
This is the model card for Cosmosis-3x34B, a Mixture of Experts (MoE) model I built with [mergekit](https://github.com/cg123/mergekit).
# Prompt Template(s):
Since [bagel-dpo-34b-v0.2](https://huggingface.co/jondurbin/bagel-dpo-34b-v0.2) was trained on many prompt templates, you can use the templates provided by bagel as well as the other experts' prompt templates.
**Note:** I currently do not know which prompt template is best.
### ChatML:
```
<|im_start|>system
{system}<|im_end|>
<|im_start|>user
{user}<|im_end|>
<|im_start|>assistant
{assistant}<|im_end|>
```
### Human Assistant
```
Human: {user}
### Assistant: {assistant}
```
### Alpaca (sort of)
```
Below is an instruction that describes a task. Write a response that appropriately completes the request.
### Instruction:
{system}
{instruction}
### Response:
```
### Vicuna
```
{system}
USER: {instruction}
ASSISTANT:
```
Visit [bagel-dpo-34b-v0.2](https://huggingface.co/jondurbin/bagel-dpo-34b-v0.2) to try more prompt templates.
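As an example, here is a minimal sketch of filling in the ChatML template above by hand and generating with 🤗 Transformers. The repo id, system prompt, and generation settings are illustrative assumptions, not taken from this card.

```python
# Minimal sketch: build a ChatML prompt by hand and generate with transformers.
# Repo id and generation settings below are illustrative assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Weyaxi/Cosmosis-3x34B"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, device_map="auto", torch_dtype="auto"  # device_map needs accelerate
)

# Fill in the ChatML template shown above.
prompt = (
    "<|im_start|>system\n"
    "You are a helpful assistant.<|im_end|>\n"
    "<|im_start|>user\n"
    "Explain photosynthesis in two sentences.<|im_end|>\n"
    "<|im_start|>assistant\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
# Decode only the newly generated tokens.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```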
# Yaml Config to reproduce
```yaml
base_model: nontoxic-bagel-34b-v0.2
gate_mode: hidden
dtype: bfloat16
experts:
- source_model: bagel-dpo-34b-v0.2
positive_prompts: ["question answering", "Q:", science", "biology", "chemistry", "physics"]
negative_prompts: ["math", "reason", "mathematics", "solve", "count", "code", "python", "javascript", "programming", "algorithm"]
- source_model: Nous-Hermes-2-Yi-34B
positive_prompts: ["chat", "math", "reason", "mathematics", "solve", "count", "python", "javascript", "programming", "algorithm", "tell me", "assistant"]
- source_model: SUS-Chat-34B
positive_prompts: ["math", "reason", "mathematics", "solve", "count", "assistant"]
```
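To reproduce the merge from this config, a sketch along these lines should work, assuming mergekit is installed and its `mergekit-moe` command-line entry point is available; the file and directory names are placeholders.

```python
# Minimal sketch: save the YAML config above to disk, then call mergekit's
# MoE entry point via subprocess. Assumes the expert models are available
# locally or resolvable on the Hugging Face Hub.
import subprocess
from pathlib import Path

config_path = Path("cosmosis-3x34b.yaml")  # the YAML block above, saved to a file
output_dir = Path("./Cosmosis-3x34B")      # illustrative output directory

subprocess.run(["mergekit-moe", str(config_path), str(output_dir)], check=True)
```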
# Quantized versions
Quantized versions of this model are available thanks to [TheBloke](https://hf.co/TheBloke).
##### GPTQ
- [TheBloke/Cosmosis-3x34B-GPTQ](https://huggingface.co/TheBloke/Cosmosis-3x34B-GPTQ)
##### GGUF
- [TheBloke/Cosmosis-3x34B-GGUF](https://huggingface.co/TheBloke/Cosmosis-3x34B-GGUF)
##### AWQ
- [TheBloke/Cosmosis-3x34B-AWQ](https://huggingface.co/TheBloke/Cosmosis-3x34B-AWQ)
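For example, the GPTQ variant can be loaded through 🤗 Transformers roughly like this; this is a sketch assuming `optimum` and `auto-gptq` are installed, and the choice of quantization branch is left to you.

```python
# Minimal sketch: load TheBloke's GPTQ quant with transformers.
# Requires optimum and auto-gptq; defaults to the repo's main branch.
from transformers import AutoModelForCausalLM, AutoTokenizer

quant_id = "TheBloke/Cosmosis-3x34B-GPTQ"
tokenizer = AutoTokenizer.from_pretrained(quant_id)
model = AutoModelForCausalLM.from_pretrained(quant_id, device_map="auto")
```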
If you would like to support me:
[☕ Buy Me a Coffee](https://www.buymeacoffee.com/weyaxi)