---
license: other
license_name: yi-license
license_link: https://huggingface.co/01-ai/Yi-34B-200K/blob/main/LICENSE
---
|
Just a test of a very high density DARE-TIES merge, for benchmarking on the Open LLM Leaderboard. In DARE-TIES, `density` sets the fraction of each model's delta weights that survives random pruning, and `weight` scales that model's contribution to the merge; a toy sketch of the pruning step follows the config below. Config:

```yaml
models:
  - model: /home/alpha/Storage/Models/Raw/chargoddard_Yi-34B-200K-Llama
    # no parameters necessary for base model
  - model: /home/alpha/Storage/Models/Raw/migtissera_Tess-34B-v1.4
    parameters:
      weight: 0.19
      density: 0.83
  - model: /home/alpha//Storage/Models/Raw/bhenrym14_airoboros-3_1-yi-34b-200k
    parameters:
      weight: 0.14
      density: 0.6
  - model: /home/alpha/Storage/Models/Raw/Nous-Capybara-34B
    parameters:
      weight: 0.19
      density: 0.83
  - model: /home/alpha/Storage/Models/Raw/kyujinpy_PlatYi-34B-200K-Q
    parameters:
      weight: 0.14
      density: 0.6
  - model: /home/alpha/FastModels/ehartford_dolphin-2.2-yi-34b-200k
    parameters:
      weight: 0.19
      density: 0.83
  - model: /home/alpha/FastModels/fblgit_una-xaberius-34b-v1beta
    parameters:
      weight: 0.15
      density: 0.08
merge_method: dare_ties
base_model: /home/alpha/Storage/Models/Raw/chargoddard_Yi-34B-200K-Llama
parameters:
  int8_mask: true
dtype: bfloat16
```
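To make the `density` values above concrete: DARE randomly drops each delta weight (fine-tune weight minus base weight) with probability `1 - density` and rescales the survivors by `1 / density`, so the expected delta is unchanged. A minimal PyTorch sketch of that pruning step, as an illustration only, not mergekit's actual implementation:

```python
import torch

def dare_prune(delta: torch.Tensor, density: float) -> torch.Tensor:
    """Keep each delta weight with probability `density`, then rescale
    survivors by 1/density so the expected delta is unchanged."""
    mask = torch.bernoulli(torch.full_like(delta, density))
    return delta * mask / density

delta = torch.randn(8)              # toy fine-tune delta
print(dare_prune(delta, 0.83))      # high density: most deltas survive
print(dare_prune(delta, 0.08))      # xaberius entry: almost everything dropped
```

At `density: 0.83` the merge keeps most of each fine-tune's changes, which is the "very high density" being tested here; the una-xaberius entry at 0.08 contributes only a sparse sample of its deltas.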
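For reference, a config like this is normally executed with mergekit, either via the CLI (`mergekit-yaml config.yml ./merged --cuda`) or its Python entry points. A minimal sketch, assuming the config above is saved as `config.yml`; exact option names may differ between mergekit versions:

```python
import yaml
import torch
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Parse the merge configuration shown above.
with open("config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Run the DARE-TIES merge and write the result to ./merged.
run_merge(
    merge_config,
    out_path="./merged",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),
        copy_tokenizer=True,
    ),
)
```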
|
See the main model card: https://huggingface.co/brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties