eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
qingy2024_OwO-14B-Instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [qingy2024/OwO-14B-Instruct](https://huggingface.co/qingy2024/OwO-14B-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__OwO-14B-Instruct-details) | qingy2024/OwO-14B-Instruct | 0c64ce33086d285d9374f0fb9360d52d0eb1ff92 | 29.286447 | apache-2.0 | 0 | 14.77 | true | false | false | false | 5.629522 | 0.138312 | 13.83119 | 0.616481 | 44.948452 | 0.416163 | 41.616314 | 0.364094 | 15.212528 | 0.440687 | 13.652604 | 0.518118 | 46.457595 | false | false | 2024-12-27 | 2024-12-30 | 2 | Qwen/Qwen2.5-14B |
qingy2024_QwEnlarge-16B-Instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [qingy2024/QwEnlarge-16B-Instruct](https://huggingface.co/qingy2024/QwEnlarge-16B-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__QwEnlarge-16B-Instruct-details) | qingy2024/QwEnlarge-16B-Instruct | 7b89422d7570b46b8eccd3f2cc33717bfe46bf15 | 37.700137 | | 0 | 15.871 | false | false | false | true | 1.798798 | 0.780182 | 78.018214 | 0.594934 | 42.595453 | 0.45997 | 45.996979 | 0.333054 | 11.073826 | 0.410125 | 9.898958 | 0.447557 | 38.617391 | false | false | 2025-03-06 | 2025-03-06 | 1 | qingy2024/QwEnlarge-16B-Instruct (Merge) |
qingy2024_QwQ-14B-Math-v0.2_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [qingy2024/QwQ-14B-Math-v0.2](https://huggingface.co/qingy2024/QwQ-14B-Math-v0.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__QwQ-14B-Math-v0.2-details) | qingy2024/QwQ-14B-Math-v0.2 | 308f732e0f2c1ac9e416e9c1e0523c0198ac658c | 28.935415 | apache-2.0 | 18 | 14.77 | true | false | false | true | 6.822343 | 0.339097 | 33.909693 | 0.573098 | 39.099214 | 0.481118 | 48.111782 | 0.262584 | 1.677852 | 0.402094 | 8.595052 | 0.47997 | 42.218898 | false | false | 2024-12-20 | 2024-12-23 | 2 | Qwen/Qwen2.5-14B |
qingy2024_Qwarkstar-4B_bfloat16 | bfloat16 | 🟩 continuously pretrained | 🟩 | Original | Qwen2ForCausalLM | [qingy2024/Qwarkstar-4B](https://huggingface.co/qingy2024/Qwarkstar-4B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__Qwarkstar-4B-details) | qingy2024/Qwarkstar-4B | c3dd554ec8f344e31b91b0532864388d6151700a | 14.167331 | | 0 | 4.473 | false | false | false | false | 2.234656 | 0.199412 | 19.9412 | 0.401491 | 16.574205 | 0.086103 | 8.610272 | 0.324664 | 9.955257 | 0.442833 | 14.0875 | 0.24252 | 15.83555 | false | false | 2025-01-05 | 2025-01-10 | 1 | qingy2024/Qwarkstar-4B (Merge) |
qingy2024_Qwarkstar-4B-Instruct-Preview_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | [qingy2024/Qwarkstar-4B-Instruct-Preview](https://huggingface.co/qingy2024/Qwarkstar-4B-Instruct-Preview) [📑](https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__Qwarkstar-4B-Instruct-Preview-details) | qingy2024/Qwarkstar-4B-Instruct-Preview | cd93b138d949e75eed3c4dba1f4dbdfe92ce255c | 18.87301 | apache-2.0 | 2 | 4.473 | true | false | false | true | 1.93441 | 0.532437 | 53.243727 | 0.435844 | 20.234017 | 0.128399 | 12.839879 | 0.280201 | 4.026846 | 0.389594 | 6.199219 | 0.250249 | 16.694371 | false | false | 2025-01-10 | 2025-01-17 | 1 | qingy2024/Qwarkstar-4B-Instruct-Preview (Merge) |
qingy2024_Qwen2.5-4B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | Qwen2ForCausalLM | [qingy2024/Qwen2.5-4B](https://huggingface.co/qingy2024/Qwen2.5-4B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__Qwen2.5-4B-details) | qingy2024/Qwen2.5-4B | e2736ed3972e1a0b2c1d6357acec2c21369827e1 | 14.275357 | | 0 | 4.168 | false | false | false | false | 1.91953 | 0.215848 | 21.584839 | 0.426938 | 19.977752 | 0.05136 | 5.135952 | 0.291107 | 5.480984 | 0.461031 | 16.528906 | 0.252493 | 16.943706 | false | false | 2025-01-03 | 2025-01-16 | 1 | qingy2024/Qwen2.5-4B (Merge) |
qingy2024_Qwen2.5-Coder-Draft-1.5B-Instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [qingy2024/Qwen2.5-Coder-Draft-1.5B-Instruct](https://huggingface.co/qingy2024/Qwen2.5-Coder-Draft-1.5B-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__Qwen2.5-Coder-Draft-1.5B-Instruct-details) | qingy2024/Qwen2.5-Coder-Draft-1.5B-Instruct | 890f7a85cdb481969b11dd09c9bbf5bb4a97ee0a | 14.636503 | | 0 | 1.544 | false | false | false | true | 1.119952 | 0.412511 | 41.251103 | 0.38368 | 13.001066 | 0.157855 | 15.785498 | 0.260067 | 1.342282 | 0.358 | 2.616667 | 0.224402 | 13.8224 | false | false | 2025-01-31 | 2025-01-31 | 1 | qingy2024/Qwen2.5-Coder-Draft-1.5B-Instruct (Merge) |
qingy2024_Qwen2.5-Math-14B-Instruct-Alpha_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [qingy2024/Qwen2.5-Math-14B-Instruct-Alpha](https://huggingface.co/qingy2024/Qwen2.5-Math-14B-Instruct-Alpha) [📑](https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__Qwen2.5-Math-14B-Instruct-Alpha-details) | qingy2024/Qwen2.5-Math-14B-Instruct-Alpha | c82727eb404d3d55450759301b80f838e4d3e1fc | 39.352857 | apache-2.0 | 2 | 14.77 | true | false | false | true | 3.138693 | 0.77044 | 77.044021 | 0.646486 | 50.179503 | 0.429003 | 42.900302 | 0.348993 | 13.199105 | 0.402094 | 8.728385 | 0.496592 | 44.065824 | false | false | 2024-12-03 | 2024-12-10 | 2 | Qwen/Qwen2.5-14B |
qingy2024_Qwen2.5-Math-14B-Instruct-Preview_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [qingy2024/Qwen2.5-Math-14B-Instruct-Preview](https://huggingface.co/qingy2024/Qwen2.5-Math-14B-Instruct-Preview) [📑](https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__Qwen2.5-Math-14B-Instruct-Preview-details) | qingy2024/Qwen2.5-Math-14B-Instruct-Preview | 7b9e9b94d69f0de9627f728e9328fb394f7fea14 | 39.918107 | apache-2.0 | 1 | 14.77 | true | false | false | true | 3.237961 | 0.78258 | 78.258022 | 0.629394 | 47.050808 | 0.475831 | 47.583082 | 0.340604 | 12.080537 | 0.411458 | 10.165625 | 0.499335 | 44.370567 | false | false | 2024-12-01 | 2024-12-10 | 3 | Qwen/Qwen2.5-14B |
qingy2024_Qwen2.6-14B-Instruct_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [qingy2024/Qwen2.6-14B-Instruct](https://huggingface.co/qingy2024/Qwen2.6-14B-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__Qwen2.6-14B-Instruct-details) | qingy2024/Qwen2.6-14B-Instruct | c21acf3c074e9522c5d0559ccc4ed715c48b8eff | 36.254385 | | 1 | 14.766 | false | false | false | false | 3.578572 | 0.581097 | 58.109704 | 0.639414 | 48.047948 | 0.305136 | 30.513595 | 0.379195 | 17.225951 | 0.456938 | 16.017188 | 0.528507 | 47.611924 | false | false | 2024-12-04 | 2024-12-04 | 1 | qingy2024/Qwen2.6-14B-Instruct (Merge) |
qingy2024_Qwen2.6-Math-14B-Instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [qingy2024/Qwen2.6-Math-14B-Instruct](https://huggingface.co/qingy2024/Qwen2.6-Math-14B-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__Qwen2.6-Math-14B-Instruct-details) | qingy2024/Qwen2.6-Math-14B-Instruct | 45bb3f302922fbf185694bba2748a32ca3313a5e | 35.196454 | apache-2.0 | 1 | 14 | true | false | false | false | 3.112143 | 0.386232 | 38.623186 | 0.632444 | 47.022117 | 0.429003 | 42.900302 | 0.369966 | 15.995526 | 0.475854 | 19.515104 | 0.524102 | 47.122488 | false | false | 2024-12-04 | 2024-12-04 | 3 | Qwen/Qwen2.5-14B |
qq8933_OpenLongCoT-Base-Gemma2-2B_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | [qq8933/OpenLongCoT-Base-Gemma2-2B](https://huggingface.co/qq8933/OpenLongCoT-Base-Gemma2-2B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/qq8933__OpenLongCoT-Base-Gemma2-2B-details) | qq8933/OpenLongCoT-Base-Gemma2-2B | 39e5bc941f107ac28142c802aecfd257cc47c1bb | 5.473142 | other | 8 | 3.204 | true | false | false | true | 3.316973 | 0.196514 | 19.651414 | 0.310636 | 3.546298 | 0.023414 | 2.34139 | 0.262584 | 1.677852 | 0.32225 | 2.114583 | 0.131566 | 3.507314 | false | false | 2024-10-28 | 2024-11-12 | 2 | google/gemma-2-2b |
raphgg_test-2.5-72B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [raphgg/test-2.5-72B](https://huggingface.co/raphgg/test-2.5-72B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/raphgg__test-2.5-72B-details) | raphgg/test-2.5-72B | 0f34d627ccd451c5bd74f495bcdb8b18787d6f3b | 46.739878 | apache-2.0 | 0 | 72.706 | true | false | false | true | 44.864705 | 0.843705 | 84.37047 | 0.72661 | 62.154127 | 0.410876 | 41.087613 | 0.389262 | 18.568233 | 0.481188 | 20.515104 | 0.583693 | 53.74372 | false | false | 2023-07-27 | 2024-12-27 | 0 | raphgg/test-2.5-72B |
rasyosef_Mistral-NeMo-Minitron-8B-Chat_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | [rasyosef/Mistral-NeMo-Minitron-8B-Chat](https://huggingface.co/rasyosef/Mistral-NeMo-Minitron-8B-Chat) [📑](https://huggingface.co/datasets/open-llm-leaderboard/rasyosef__Mistral-NeMo-Minitron-8B-Chat-details) | rasyosef/Mistral-NeMo-Minitron-8B-Chat | cede47eac8a4e65aa27567d3f087c28185b537d9 | 17.545649 | other | 9 | 8.414 | true | false | false | true | 2.952796 | 0.445184 | 44.518433 | 0.475944 | 26.036695 | 0.02719 | 2.719033 | 0.276007 | 3.467562 | 0.430427 | 12.936719 | 0.240359 | 15.595449 | false | false | 2024-08-26 | 2024-08-26 | 1 | nvidia/Mistral-NeMo-Minitron-8B-Base |
rasyosef_Phi-1_5-Instruct-v0.1_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | PhiForCausalLM | [rasyosef/Phi-1_5-Instruct-v0.1](https://huggingface.co/rasyosef/Phi-1_5-Instruct-v0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/rasyosef__Phi-1_5-Instruct-v0.1-details) | rasyosef/Phi-1_5-Instruct-v0.1 | f4c405ee4bff5dc1a69383f3fe682342c9c87c77 | 6.864748 | mit | 1 | 1.415 | true | false | false | true | 0.590044 | 0.240228 | 24.022815 | 0.31179 | 4.820244 | 0.013595 | 1.359517 | 0.260067 | 1.342282 | 0.342156 | 3.402865 | 0.156167 | 6.240765 | false | false | 2024-07-24 | 2024-07-25 | 1 | microsoft/phi-1_5 |
rasyosef_phi-2-instruct-apo_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | PhiForCausalLM | [rasyosef/phi-2-instruct-apo](https://huggingface.co/rasyosef/phi-2-instruct-apo) [📑](https://huggingface.co/datasets/open-llm-leaderboard/rasyosef__phi-2-instruct-apo-details) | rasyosef/phi-2-instruct-apo | 2d3722d6db77a8c844a50dd32ddc4278fdc89e1f | 12.547053 | mit | 0 | 2.775 | true | false | false | true | 0.99013 | 0.314592 | 31.459195 | 0.44451 | 21.672438 | 0.030211 | 3.021148 | 0.270134 | 2.684564 | 0.334219 | 3.610677 | 0.215509 | 12.834294 | false | false | 2024-09-15 | 2024-09-17 | 1 | microsoft/phi-2 |
rasyosef_phi-2-instruct-v0.1_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | PhiForCausalLM | [rasyosef/phi-2-instruct-v0.1](https://huggingface.co/rasyosef/phi-2-instruct-v0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/rasyosef__phi-2-instruct-v0.1-details) | rasyosef/phi-2-instruct-v0.1 | 29aeb3ccf7c79e0169a038fbd0deaf9772a9fefd | 14.218631 | mit | 2 | 2.775 | true | false | false | true | 0.492726 | 0.368148 | 36.814763 | 0.472612 | 26.358802 | 0 | 0 | 0.274329 | 3.243848 | 0.352354 | 5.044271 | 0.224651 | 13.850103 | false | false | 2024-08-09 | 2024-08-10 | 1 | microsoft/phi-2 |
realtreetune_rho-1b-sft-MATH_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [realtreetune/rho-1b-sft-MATH](https://huggingface.co/realtreetune/rho-1b-sft-MATH) [📑](https://huggingface.co/datasets/open-llm-leaderboard/realtreetune__rho-1b-sft-MATH-details) | realtreetune/rho-1b-sft-MATH | b5f93df6af679a860caac9a9598e0f70c326b4fb | 5.569175 | | 0 | 1.1 | false | false | false | false | 0.556268 | 0.212102 | 21.210167 | 0.314415 | 4.197623 | 0.034743 | 3.47432 | 0.252517 | 0.33557 | 0.345844 | 2.897135 | 0.111702 | 1.300236 | false | false | 2024-06-06 | 2024-10-05 | 1 | realtreetune/rho-1b-sft-MATH (Merge) |
recoilme_Gemma-2-Ataraxy-Gemmasutra-9B-slerp_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | [recoilme/Gemma-2-Ataraxy-Gemmasutra-9B-slerp](https://huggingface.co/recoilme/Gemma-2-Ataraxy-Gemmasutra-9B-slerp) [📑](https://huggingface.co/datasets/open-llm-leaderboard/recoilme__Gemma-2-Ataraxy-Gemmasutra-9B-slerp-details) | recoilme/Gemma-2-Ataraxy-Gemmasutra-9B-slerp | 9048af8616bc62b6efab2bc1bc77ba53c5dfed79 | 29.873992 | apache-2.0 | 4 | 10.159 | true | false | false | true | 2.114373 | 0.764895 | 76.489492 | 0.597439 | 42.25121 | 0.017372 | 1.73716 | 0.330537 | 10.738255 | 0.424479 | 12.393229 | 0.420711 | 35.634604 | false | false | 2024-09-11 | 2024-09-12 | 0 | recoilme/Gemma-2-Ataraxy-Gemmasutra-9B-slerp |
recoilme_Gemma-2-Ataraxy-Gemmasutra-9B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [recoilme/Gemma-2-Ataraxy-Gemmasutra-9B-slerp](https://huggingface.co/recoilme/Gemma-2-Ataraxy-Gemmasutra-9B-slerp) [📑](https://huggingface.co/datasets/open-llm-leaderboard/recoilme__Gemma-2-Ataraxy-Gemmasutra-9B-slerp-details) | recoilme/Gemma-2-Ataraxy-Gemmasutra-9B-slerp | 5a4f7299d9f8ea5faad2b1edc68b7bf634dac40b | 23.910553 | apache-2.0 | 4 | 10.159 | true | false | false | false | 5.939655 | 0.285365 | 28.536505 | 0.598393 | 42.703798 | 0.100453 | 10.045317 | 0.329698 | 10.626398 | 0.460656 | 16.415365 | 0.416223 | 35.135934 | false | false | 2024-09-11 | 2024-09-27 | 0 | recoilme/Gemma-2-Ataraxy-Gemmasutra-9B-slerp |
recoilme_recoilme-gemma-2-9B-v0.1_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | [recoilme/recoilme-gemma-2-9B-v0.1](https://huggingface.co/recoilme/recoilme-gemma-2-9B-v0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/recoilme__recoilme-gemma-2-9B-v0.1-details) | recoilme/recoilme-gemma-2-9B-v0.1 | 6dc0997046db4e9932f87d338ecdc2a4158abbda | 32.724599 | | 0 | 10.159 | false | false | false | true | 3.849617 | 0.751506 | 75.1506 | 0.599531 | 42.321861 | 0.203927 | 20.392749 | 0.338926 | 11.856823 | 0.419146 | 11.526563 | 0.415891 | 35.098995 | false | false | 2024-09-18 | | 0 | Removed |
recoilme_recoilme-gemma-2-9B-v0.2_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | [recoilme/recoilme-gemma-2-9B-v0.2](https://huggingface.co/recoilme/recoilme-gemma-2-9B-v0.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/recoilme__recoilme-gemma-2-9B-v0.2-details) | recoilme/recoilme-gemma-2-9B-v0.2 | 483116e575fb3a56de25243b14d715c58fe127bc | 30.048864 | cc-by-nc-4.0 | 2 | 10.159 | true | false | false | true | 1.914086 | 0.759175 | 75.917455 | 0.602596 | 43.027969 | 0.05287 | 5.287009 | 0.328859 | 10.514541 | 0.409875 | 10.401042 | 0.416307 | 35.145168 | false | false | 2024-09-18 | 2024-09-18 | 0 | recoilme/recoilme-gemma-2-9B-v0.2 |
recoilme_recoilme-gemma-2-9B-v0.2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [recoilme/recoilme-gemma-2-9B-v0.2](https://huggingface.co/recoilme/recoilme-gemma-2-9B-v0.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/recoilme__recoilme-gemma-2-9B-v0.2-details) | recoilme/recoilme-gemma-2-9B-v0.2 | 483116e575fb3a56de25243b14d715c58fe127bc | 23.762851 | cc-by-nc-4.0 | 2 | 10.159 | true | false | false | false | 5.893569 | 0.274699 | 27.469891 | 0.603083 | 43.560581 | 0.083082 | 8.308157 | 0.330537 | 10.738255 | 0.468594 | 17.807552 | 0.412234 | 34.692671 | false | false | 2024-09-18 | 2024-09-27 | 0 | recoilme/recoilme-gemma-2-9B-v0.2 |
recoilme_recoilme-gemma-2-9B-v0.3_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | [recoilme/recoilme-gemma-2-9B-v0.3](https://huggingface.co/recoilme/recoilme-gemma-2-9B-v0.3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/recoilme__recoilme-gemma-2-9B-v0.3-details) | recoilme/recoilme-gemma-2-9B-v0.3 | 772cab46d9d22cbcc3c574d193021803ce5c444c | 30.207472 | cc-by-nc-4.0 | 3 | 10.159 | true | false | false | true | 1.876637 | 0.743937 | 74.39372 | 0.599253 | 42.026279 | 0.087613 | 8.761329 | 0.323826 | 9.8434 | 0.420385 | 12.08151 | 0.407247 | 34.138593 | false | false | 2024-09-18 | 2024-09-18 | 0 | recoilme/recoilme-gemma-2-9B-v0.3 |
recoilme_recoilme-gemma-2-9B-v0.3_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [recoilme/recoilme-gemma-2-9B-v0.3](https://huggingface.co/recoilme/recoilme-gemma-2-9B-v0.3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/recoilme__recoilme-gemma-2-9B-v0.3-details) | recoilme/recoilme-gemma-2-9B-v0.3 | 76c8fb761660e6eb237c91bb6e6761ee36266bba | 30.375989 | cc-by-nc-4.0 | 3 | 10.159 | true | false | false | false | 5.110699 | 0.576076 | 57.607592 | 0.601983 | 43.326868 | 0.188822 | 18.882175 | 0.337248 | 11.63311 | 0.463229 | 17.036979 | 0.403923 | 33.769208 | false | false | 2024-09-18 | 2024-09-27 | 0 | recoilme/recoilme-gemma-2-9B-v0.3 |
recoilme_recoilme-gemma-2-9B-v0.4_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [recoilme/recoilme-gemma-2-9B-v0.4](https://huggingface.co/recoilme/recoilme-gemma-2-9B-v0.4) [📑](https://huggingface.co/datasets/open-llm-leaderboard/recoilme__recoilme-gemma-2-9B-v0.4-details) | recoilme/recoilme-gemma-2-9B-v0.4 | 2691f2cc8d80072f15d78cb7ae72831e1a12139e | 24.138128 | cc-by-nc-4.0 | 5 | 10.159 | true | false | false | false | 5.83782 | 0.256189 | 25.618913 | 0.596729 | 42.442482 | 0.084592 | 8.459215 | 0.340604 | 12.080537 | 0.472688 | 18.385938 | 0.440575 | 37.841681 | false | false | 2024-09-18 | 2024-09-19 | 0 | recoilme/recoilme-gemma-2-9B-v0.4 |
recoilme_recoilme-gemma-2-9B-v0.5_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | [recoilme/recoilme-gemma-2-9B-v0.5](https://huggingface.co/recoilme/recoilme-gemma-2-9B-v0.5) [📑](https://huggingface.co/datasets/open-llm-leaderboard/recoilme__recoilme-gemma-2-9B-v0.5-details) | recoilme/recoilme-gemma-2-9B-v0.5 | b4035d3a16486dae4f726eb953be959a4573ea67 | 33.229967 | | 0 | 10.159 | false | false | false | true | 5.791467 | 0.766419 | 76.641866 | 0.598147 | 42.353355 | 0.21148 | 21.148036 | 0.336409 | 11.521253 | 0.423177 | 12.163802 | 0.419963 | 35.551492 | false | false | 2024-11-26 | 2024-11-26 | 1 | recoilme/recoilme-gemma-2-9B-v0.5 (Merge) |
redrix_AngelSlayer-12B-Unslop-Mell-RPMax-DARKNESS_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | [redrix/AngelSlayer-12B-Unslop-Mell-RPMax-DARKNESS](https://huggingface.co/redrix/AngelSlayer-12B-Unslop-Mell-RPMax-DARKNESS) [📑](https://huggingface.co/datasets/open-llm-leaderboard/redrix__AngelSlayer-12B-Unslop-Mell-RPMax-DARKNESS-details) | redrix/AngelSlayer-12B-Unslop-Mell-RPMax-DARKNESS | 1523f26adec368380647e864dd2e9fa79f36fefe | 22.776538 | apache-2.0 | 13 | 12.248 | true | false | false | true | 2.191697 | 0.535959 | 53.595903 | 0.512884 | 29.965932 | 0.113293 | 11.329305 | 0.315436 | 8.724832 | 0.381781 | 8.822656 | 0.317985 | 24.220597 | true | false | 2024-12-05 | 2024-12-12 | 1 | redrix/AngelSlayer-12B-Unslop-Mell-RPMax-DARKNESS (Merge) |
redrix_patricide-12B-Unslop-Mell_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | [redrix/patricide-12B-Unslop-Mell](https://huggingface.co/redrix/patricide-12B-Unslop-Mell) [📑](https://huggingface.co/datasets/open-llm-leaderboard/redrix__patricide-12B-Unslop-Mell-details) | redrix/patricide-12B-Unslop-Mell | 2f1a849859a24da80bd1f938a2ac6ab627ef75e8 | 23.021831 | apache-2.0 | 16 | 12.248 | true | false | false | false | 2.05911 | 0.40739 | 40.739017 | 0.539867 | 33.989448 | 0.13142 | 13.141994 | 0.323826 | 9.8434 | 0.402583 | 11.85625 | 0.357048 | 28.560875 | true | false | 2024-12-01 | 2024-12-12 | 1 | redrix/patricide-12B-Unslop-Mell (Merge) |
refuelai_Llama-3-Refueled_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [refuelai/Llama-3-Refueled](https://huggingface.co/refuelai/Llama-3-Refueled) [📑](https://huggingface.co/datasets/open-llm-leaderboard/refuelai__Llama-3-Refueled-details) | refuelai/Llama-3-Refueled | ff6d1c3ba37b31d4af421951c2300f2256fb3691 | 23.181448 | cc-by-nc-4.0 | 192 | 8.03 | true | false | false | true | 1.751972 | 0.461995 | 46.199528 | 0.587077 | 41.721971 | 0.066465 | 6.646526 | 0.299497 | 6.599553 | 0.445406 | 14.642448 | 0.309508 | 23.278664 | false | true | 2024-05-03 | 2024-06-12 | 0 | refuelai/Llama-3-Refueled |
rhplus0831_maid-yuzu-v7_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MixtralForCausalLM | [rhplus0831/maid-yuzu-v7](https://huggingface.co/rhplus0831/maid-yuzu-v7) [📑](https://huggingface.co/datasets/open-llm-leaderboard/rhplus0831__maid-yuzu-v7-details) | rhplus0831/maid-yuzu-v7 | a0bd8c707bb80024778da4a0d057917faa53d2f6 | 24.595223 | | 1 | 46.703 | false | false | false | true | 8.208571 | 0.646243 | 64.624308 | 0.480492 | 26.819837 | 0.101964 | 10.196375 | 0.309564 | 7.941834 | 0.413625 | 9.769792 | 0.353973 | 28.219193 | false | false | 2024-02-09 | 2024-09-08 | 1 | rhplus0831/maid-yuzu-v7 (Merge) |
rhymes-ai_Aria_bfloat16 | bfloat16 | 🌸 multimodal | 🌸 | Original | AriaForConditionalGeneration | [rhymes-ai/Aria](https://huggingface.co/rhymes-ai/Aria) [📑](https://huggingface.co/datasets/open-llm-leaderboard/rhymes-ai__Aria-details) | rhymes-ai/Aria | 5cc2703b3afd585f232ec5027e9c039a2001bcec | 28.870164 | apache-2.0 | 623 | 25.307 | true | true | false | true | 15.501419 | 0.477308 | 47.730799 | 0.569531 | 39.281493 | 0.193353 | 19.335347 | 0.362416 | 14.988814 | 0.43375 | 14.052083 | 0.440492 | 37.832447 | false | true | 2024-09-26 | 2024-10-10 | 1 | rhymes-ai/Aria (Merge) |
rhysjones_phi-2-orange-v2_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | PhiForCausalLM | [rhysjones/phi-2-orange-v2](https://huggingface.co/rhysjones/phi-2-orange-v2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/rhysjones__phi-2-orange-v2-details) | rhysjones/phi-2-orange-v2 | f4085189114accfb65225deb8fbdf15767b7ee56 | 15.324185 | mit | 26 | 2.78 | true | false | false | true | 0.941899 | 0.366974 | 36.697407 | 0.477022 | 25.606549 | 0.040785 | 4.07855 | 0.261745 | 1.565996 | 0.362958 | 6.969792 | 0.253241 | 17.026817 | false | false | 2024-03-04 | 2024-06-28 | 0 | rhysjones/phi-2-orange-v2 |
riaz_FineLlama-3.1-8B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [riaz/FineLlama-3.1-8B](https://huggingface.co/riaz/FineLlama-3.1-8B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/riaz__FineLlama-3.1-8B-details) | riaz/FineLlama-3.1-8B | c4d8f16eb446910edce0c1afd0e6d5f3b06e2e7d | 17.660648 | apache-2.0 | 1 | 8.03 | true | false | false | true | 1.842183 | 0.437341 | 43.73407 | 0.458573 | 24.148778 | 0.05136 | 5.135952 | 0.275168 | 3.355705 | 0.376292 | 7.769792 | 0.296376 | 21.819592 | false | false | 2024-10-07 | 2024-10-12 | 2 | meta-llama/Meta-Llama-3.1-8B |
riaz_FineLlama-3.1-8B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [riaz/FineLlama-3.1-8B](https://huggingface.co/riaz/FineLlama-3.1-8B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/riaz__FineLlama-3.1-8B-details) | riaz/FineLlama-3.1-8B | c4d8f16eb446910edce0c1afd0e6d5f3b06e2e7d | 17.147511 | apache-2.0 | 1 | 8.03 | true | false | false | true | 0.901998 | 0.41366 | 41.36602 | 0.456452 | 23.77339 | 0.045317 | 4.531722 | 0.276007 | 3.467562 | 0.377625 | 7.769792 | 0.297789 | 21.976581 | false | false | 2024-10-07 | 2024-10-12 | 2 | meta-llama/Meta-Llama-3.1-8B |
rmdhirr_Gluon-8B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [rmdhirr/Gluon-8B](https://huggingface.co/rmdhirr/Gluon-8B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/rmdhirr__Gluon-8B-details) | rmdhirr/Gluon-8B | cc949908c60ab7f696e133714222d6cab156e493 | 23.976963 | apache-2.0 | 1 | 8.03 | true | false | false | false | 1.806157 | 0.505285 | 50.528487 | 0.515331 | 30.342247 | 0.14426 | 14.425982 | 0.312081 | 8.277405 | 0.403885 | 9.085677 | 0.380818 | 31.20198 | true | false | 2024-09-14 | 2024-09-14 | 1 | rmdhirr/Gluon-8B (Merge) |
rombodawg_Rombos-Coder-V2.5-Qwen-14b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [rombodawg/Rombos-Coder-V2.5-Qwen-14b](https://huggingface.co/rombodawg/Rombos-Coder-V2.5-Qwen-14b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/rombodawg__Rombos-Coder-V2.5-Qwen-14b-details) | rombodawg/Rombos-Coder-V2.5-Qwen-14b | 00147618f151b8973b4b25f18281625105482af9 | 32.445608 | apache-2.0 | 6 | 14.77 | true | false | false | true | 2.702803 | 0.704745 | 70.474452 | 0.616514 | 44.519499 | 0.33006 | 33.006042 | 0.302852 | 7.04698 | 0.391458 | 6.965625 | 0.393949 | 32.661052 | false | false | 2024-11-12 | 2025-02-06 | 1 | rombodawg/Rombos-Coder-V2.5-Qwen-14b (Merge) |
rombodawg_Rombos-Coder-V2.5-Qwen-7b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [rombodawg/Rombos-Coder-V2.5-Qwen-7b](https://huggingface.co/rombodawg/Rombos-Coder-V2.5-Qwen-7b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/rombodawg__Rombos-Coder-V2.5-Qwen-7b-details) | rombodawg/Rombos-Coder-V2.5-Qwen-7b | 896a040ead29dd6352ef7fadbf2451ce72baeca9 | 27.405415 | apache-2.0 | 5 | 7.616 | true | false | false | true | 1.243923 | 0.621039 | 62.103884 | 0.507709 | 30.22172 | 0.333837 | 33.383686 | 0.283557 | 4.474273 | 0.397938 | 7.608854 | 0.339761 | 26.640071 | false | false | 2024-10-28 | 2025-02-06 | 1 | rombodawg/Rombos-Coder-V2.5-Qwen-7b (Merge) |
rombodawg_Rombos-LLM-V2.5-Qwen-0.5b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [rombodawg/Rombos-LLM-V2.5-Qwen-0.5b](https://huggingface.co/rombodawg/Rombos-LLM-V2.5-Qwen-0.5b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/rombodawg__Rombos-LLM-V2.5-Qwen-0.5b-details) | rombodawg/Rombos-LLM-V2.5-Qwen-0.5b | aae2e55548c8090ce357c64ca78e8b9ef6baf118 | 9.38592 | apache-2.0 | 4 | 0.63 | true | false | false | false | 1.291414 | 0.284667 | 28.466691 | 0.329368 | 8.412219 | 0.067976 | 6.797583 | 0.266779 | 2.237136 | 0.323583 | 0.78125 | 0.186586 | 9.620641 | false | false | 2024-10-06 | 2024-09-29 | 1 | rombodawg/Rombos-LLM-V2.5-Qwen-0.5b (Merge) |
rombodawg_Rombos-LLM-V2.5-Qwen-1.5b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [rombodawg/Rombos-LLM-V2.5-Qwen-1.5b](https://huggingface.co/rombodawg/Rombos-LLM-V2.5-Qwen-1.5b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/rombodawg__Rombos-LLM-V2.5-Qwen-1.5b-details) | rombodawg/Rombos-LLM-V2.5-Qwen-1.5b | 1f634da015ed671efe7dc574bc2a1954f5b2cc93 | 16.354386 | apache-2.0 | 4 | 1.777 | true | false | false | false | 1.480716 | 0.340246 | 34.02461 | 0.42567 | 18.711344 | 0.085347 | 8.534743 | 0.288591 | 5.145414 | 0.418552 | 10.352344 | 0.292221 | 21.357861 | false | false | 2024-10-06 | 2024-09-29 | 1 | rombodawg/Rombos-LLM-V2.5-Qwen-1.5b (Merge) |
rombodawg_Rombos-LLM-V2.5-Qwen-14b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [rombodawg/Rombos-LLM-V2.5-Qwen-14b](https://huggingface.co/rombodawg/Rombos-LLM-V2.5-Qwen-14b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/rombodawg__Rombos-LLM-V2.5-Qwen-14b-details) | rombodawg/Rombos-LLM-V2.5-Qwen-14b | 834ddb1712ae6d1b232b2d5b26be658d90d23e43 | 39.500956 | apache-2.0 | 10 | 14.77 | true | false | false | false | 4.3654 | 0.584045 | 58.404478 | 0.648109 | 49.3869 | 0.455438 | 45.543807 | 0.371644 | 16.219239 | 0.471729 | 18.832812 | 0.537566 | 48.618499 | false | false | 2024-10-06 | 2024-09-29 | 1 | rombodawg/Rombos-LLM-V2.5-Qwen-14b (Merge) |
rombodawg_Rombos-LLM-V2.5-Qwen-32b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [rombodawg/Rombos-LLM-V2.5-Qwen-32b](https://huggingface.co/rombodawg/Rombos-LLM-V2.5-Qwen-32b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/rombodawg__Rombos-LLM-V2.5-Qwen-32b-details) | rombodawg/Rombos-LLM-V2.5-Qwen-32b | 234abe4b494dbe83ba805b791f74feb33462a33d | 45.833012 | apache-2.0 | 61 | 32.764 | true | false | false | false | 35.825379 | 0.682663 | 68.266311 | 0.704554 | 58.261894 | 0.495468 | 49.546828 | 0.396812 | 19.574944 | 0.503417 | 24.727083 | 0.591589 | 54.621011 | false | false | 2024-09-30 | 2024-10-07 | 1 | rombodawg/Rombos-LLM-V2.5-Qwen-32b (Merge) |
rombodawg_Rombos-LLM-V2.5-Qwen-3b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [rombodawg/Rombos-LLM-V2.5-Qwen-3b](https://huggingface.co/rombodawg/Rombos-LLM-V2.5-Qwen-3b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/rombodawg__Rombos-LLM-V2.5-Qwen-3b-details) | rombodawg/Rombos-LLM-V2.5-Qwen-3b | 26601a8da5afce3b5959d91bdd0faaab6df8bf95 | 25.921782 | other | 5 | 3.397 | true | false | false | false | 2.011589 | 0.534236 | 53.423583 | 0.48089 | 27.213597 | 0.279456 | 27.945619 | 0.307886 | 7.718121 | 0.404167 | 8.554167 | 0.37608 | 30.675606 | false | false | 2024-10-06 | 2024-09-29 | 1 | rombodawg/Rombos-LLM-V2.5-Qwen-3b (Merge) |
rombodawg_Rombos-LLM-V2.5-Qwen-72b_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [rombodawg/Rombos-LLM-V2.5-Qwen-72b](https://huggingface.co/rombodawg/Rombos-LLM-V2.5-Qwen-72b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/rombodawg__Rombos-LLM-V2.5-Qwen-72b-details) | rombodawg/Rombos-LLM-V2.5-Qwen-72b | 5260f182e7859e13d515c4cb3926ac85ad057504 | 46.500887 | other | 37 | 72.706 | true | false | false | true | 32.067891 | 0.715536 | 71.553589 | 0.722959 | 61.267145 | 0.542296 | 54.229607 | 0.39849 | 19.798658 | 0.459917 | 17.322917 | 0.593501 | 54.833407 | false | false | 2024-09-30 | 2024-12-19 | 1 | rombodawg/Rombos-LLM-V2.5-Qwen-72b (Merge) |
rombodawg_Rombos-LLM-V2.5-Qwen-7b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [rombodawg/Rombos-LLM-V2.5-Qwen-7b](https://huggingface.co/rombodawg/Rombos-LLM-V2.5-Qwen-7b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/rombodawg__Rombos-LLM-V2.5-Qwen-7b-details) | rombodawg/Rombos-LLM-V2.5-Qwen-7b | dbd819e8f765181f774cb5b79812d081669eb302 | 32.748804 | apache-2.0 | 17 | 7.616 | true | false | false | false | 2.634168 | 0.623712 | 62.371175 | 0.554389 | 36.37235 | 0.38142 | 38.141994 | 0.317953 | 9.060403 | 0.429094 | 12.003385 | 0.446892 | 38.543514 | false | false | 2024-10-06 | 2024-09-29 | 1 | rombodawg/Rombos-LLM-V2.5-Qwen-7b (Merge) |
rombodawg_Rombos-LLM-V2.5.1-Qwen-3b_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [rombodawg/Rombos-LLM-V2.5.1-Qwen-3b](https://huggingface.co/rombodawg/Rombos-LLM-V2.5.1-Qwen-3b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/rombodawg__Rombos-LLM-V2.5.1-Qwen-3b-details) | rombodawg/Rombos-LLM-V2.5.1-Qwen-3b | a3305ce148f4273ab334052ab47d3aebb51d104c | 13.357125 | other | 1 | 3.397 | true | false | false | false | 0.929244 | 0.259513 | 25.951254 | 0.388404 | 14.881409 | 0.09139 | 9.138973 | 0.274329 | 3.243848 | 0.399115 | 7.822656 | 0.271941 | 19.10461 | true | false | 2024-10-08 | 2024-10-08 | 1 | rombodawg/Rombos-LLM-V2.5.1-Qwen-3b (Merge) |
rombodawg_Rombos-LLM-V2.5.1-Qwen-3b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [rombodawg/Rombos-LLM-V2.5.1-Qwen-3b](https://huggingface.co/rombodawg/Rombos-LLM-V2.5.1-Qwen-3b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/rombodawg__Rombos-LLM-V2.5.1-Qwen-3b-details) | rombodawg/Rombos-LLM-V2.5.1-Qwen-3b | b65848c13b31f5b9d5d953df95d504d195082a3b | 13.608595 | other | 1 | 3.397 | true | false | false | false | 2.593723 | 0.25664 | 25.664016 | 0.390008 | 15.057744 | 0.120846 | 12.084592 | 0.262584 | 1.677852 | 0.399115 | 7.822656 | 0.274102 | 19.34471 | true | false | 2024-10-08 | 2024-11-14 | 1 | rombodawg/Rombos-LLM-V2.5.1-Qwen-3b (Merge) |
rombodawg_Rombos-LLM-V2.6-Nemotron-70b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [rombodawg/Rombos-LLM-V2.6-Nemotron-70b](https://huggingface.co/rombodawg/Rombos-LLM-V2.6-Nemotron-70b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/rombodawg__Rombos-LLM-V2.6-Nemotron-70b-details) | rombodawg/Rombos-LLM-V2.6-Nemotron-70b | 951c9cdf68d6e679c78625d1a1f396eb71cdf746 | 41.94623 | llama3.1 | 3 | 70.554 | true | false | false | false | 23.901549 | 0.752655 | 75.265518 | 0.69377 | 55.805573 | 0.333082 | 33.308157 | 0.40604 | 20.805369 | 0.466865 | 18.391406 | 0.532912 | 48.101359 | false | false | 2024-10-17 | 2024-10-17 | 0 | rombodawg/Rombos-LLM-V2.6-Nemotron-70b |
rombodawg_Rombos-LLM-V2.6-Qwen-14b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [rombodawg/Rombos-LLM-V2.6-Qwen-14b](https://huggingface.co/rombodawg/Rombos-LLM-V2.6-Qwen-14b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/rombodawg__Rombos-LLM-V2.6-Qwen-14b-details) | rombodawg/Rombos-LLM-V2.6-Qwen-14b | 887910d75a1837b8b8c7c3e50a257517d286ec60 | 42.199345 | apache-2.0 | 53 | 14.77 | true | false | false | true | 5.251571 | 0.843155 | 84.315505 | 0.64421 | 49.278518 | 0.521148 | 52.114804 | 0.333893 | 11.185682 | 0.422063 | 12.291146 | 0.496094 | 44.010417 | false | false | 2024-10-12 | 2024-12-07 | 1 | rombodawg/Rombos-LLM-V2.6-Qwen-14b (Merge) |
rombodawg_rombos_Replete-Coder-Instruct-8b-Merged_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [rombodawg/rombos_Replete-Coder-Instruct-8b-Merged](https://huggingface.co/rombodawg/rombos_Replete-Coder-Instruct-8b-Merged) [📑](https://huggingface.co/datasets/open-llm-leaderboard/rombodawg__rombos_Replete-Coder-Instruct-8b-Merged-details) | rombodawg/rombos_Replete-Coder-Instruct-8b-Merged | 85ad1fb943d73866ba5c8dcfe4a4f2cbfba12d4d | 16.433824 | apache-2.0 | 2 | 8.03 | true | false | false | true | 1.928256 | 0.538757 | 53.875716 | 0.446169 | 21.937707 | 0.077795 | 7.779456 | 0.269295 | 2.572707 | 0.366031 | 3.453906 | 0.180851 | 8.983452 | false | false | 2024-10-06 | 2024-10-14 | 0 | rombodawg/rombos_Replete-Coder-Instruct-8b-Merged |
rombodawg_rombos_Replete-Coder-Llama3-8B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [rombodawg/rombos_Replete-Coder-Llama3-8B](https://huggingface.co/rombodawg/rombos_Replete-Coder-Llama3-8B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/rombodawg__rombos_Replete-Coder-Llama3-8B-details) | rombodawg/rombos_Replete-Coder-Llama3-8B | 938a45789cf94821ef6b12c98dc76622a0fa936a | 11.971033 | other | 3 | 8.03 | true | false | false | true | 2.411204 | 0.471413 | 47.141252 | 0.327628 | 7.087845 | 0.039275 | 3.927492 | 0.266779 | 2.237136 | 0.396635 | 7.71276 | 0.133477 | 3.71971 | false | false | 2024-10-06 | 2024-10-14 | 0 | rombodawg/rombos_Replete-Coder-Llama3-8B |
rootxhacker_Apollo-70B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [rootxhacker/Apollo-70B](https://huggingface.co/rootxhacker/Apollo-70B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/rootxhacker__Apollo-70B-details) | rootxhacker/Apollo-70B | dea3d818bfdab718a2313e3ca023e54e3f4d9a3c | 43.159018 | mit | 0 | 70.554 | true | false | false | false | 15.709605 | 0.509856 | 50.985607 | 0.680422 | 53.528405 | 0.561178 | 56.117825 | 0.457215 | 27.628635 | 0.494771 | 23.146354 | 0.527926 | 47.547281 | true | false | 2025-03-02 | 2025-03-02 | 1 | rootxhacker/Apollo-70B (Merge) |
rootxhacker_Apollo_v2-32B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [rootxhacker/Apollo_v2-32B](https://huggingface.co/rootxhacker/Apollo_v2-32B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/rootxhacker__Apollo_v2-32B-details) | rootxhacker/Apollo_v2-32B | 2ce67ae5de87c736b78110d0e0219f4943406043 | 39.811701 | mit | 0 | 32.764 | true | false | false | true | 62.078329 | 0.428049 | 42.804869 | 0.707227 | 58.274951 | 0.427492 | 42.749245 | 0.378356 | 17.114094 | 0.499385 | 23.823177 | 0.586935 | 54.103871 | true | false | 2025-03-04 | 2025-03-11 | 1 | rootxhacker/Apollo_v2-32B (Merge) |
rootxhacker_apollo-7B_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | [rootxhacker/apollo-7B](https://huggingface.co/rootxhacker/apollo-7B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/rootxhacker__apollo-7B-details) | rootxhacker/apollo-7B | 778170316e44277245135f1ed6a6ff7f0ca6725e | 10.721176 | mit | 0 | 7.616 | true | false | false | false | 1.468812 | 0.295333 | 29.533305 | 0.363626 | 11.072694 | 0.02568 | 2.567976 | 0.278523 | 3.803132 | 0.413125 | 9.040625 | 0.174784 | 8.309323 | true | false | 2025-03-02 | 2025-03-02 | 1 | rootxhacker/apollo-7B (Merge) |
rsh345_mistral-ft-optimized-1218-NeuralHermes-2.5-Mistral-7B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | [rsh345/mistral-ft-optimized-1218-NeuralHermes-2.5-Mistral-7B](https://huggingface.co/rsh345/mistral-ft-optimized-1218-NeuralHermes-2.5-Mistral-7B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/rsh345__mistral-ft-optimized-1218-NeuralHermes-2.5-Mistral-7B-details) | rsh345/mistral-ft-optimized-1218-NeuralHermes-2.5-Mistral-7B | 9bd6ed02533f746473c7e8b926379d858e619925 | 21.027403 | | 0 | 7.242 | false | false | false | false | 0.952348 | 0.389181 | 38.918071 | 0.518844 | 32.789744 | 0.073263 | 7.326284 | 0.302852 | 7.04698 | 0.467198 | 17.266406 | 0.305352 | 22.816933 | false | false | 2025-01-27 | | 0 | Removed |
rubenroy_Geneva-12B-GCv2-5m_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | [rubenroy/Geneva-12B-GCv2-5m](https://huggingface.co/rubenroy/Geneva-12B-GCv2-5m) [📑](https://huggingface.co/datasets/open-llm-leaderboard/rubenroy__Geneva-12B-GCv2-5m-details) | rubenroy/Geneva-12B-GCv2-5m | 857f83b4043a3e28203a6b6bff19f0fad4cc1c83 | 16.956631 | apache-2.0 | 12 | 12.248 | true | false | false | false | 0.886204 | 0.258638 | 25.863819 | 0.527837 | 32.646831 | 0.08006 | 8.006042 | 0.287752 | 5.033557 | 0.352479 | 5.193229 | 0.324967 | 24.996306 | false | false | 2025-02-01 | 2025-02-27 | 2 | mistralai/Mistral-Nemo-Base-2407 |
rubenroy_Gilgamesh-72B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [rubenroy/Gilgamesh-72B](https://huggingface.co/rubenroy/Gilgamesh-72B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/rubenroy__Gilgamesh-72B-details) | rubenroy/Gilgamesh-72B | 5aa7df9b748abcbda03e8eb69b64348e09cd72e3 | 46.793672 | other | 8 | 72.706 | true | false | false | true | 51.291783 | 0.848601 | 84.86006 | 0.725333 | 61.836021 | 0.438066 | 43.806647 | 0.394295 | 19.239374 | 0.462646 | 17.664062 | 0.580203 | 53.355866 | false | false | 2025-02-01 | 2025-02-25 | 1 | rubenroy/Gilgamesh-72B (Merge) |
rubenroy_Zurich-14B-GCv2-5m_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [rubenroy/Zurich-14B-GCv2-5m](https://huggingface.co/rubenroy/Zurich-14B-GCv2-5m) [📑](https://huggingface.co/datasets/open-llm-leaderboard/rubenroy__Zurich-14B-GCv2-5m-details) | rubenroy/Zurich-14B-GCv2-5m | 08f86f70e83f376f963dd2f21b5a15cc6cf8f17b | 37.063689 | apache-2.0 | 12 | 14.77 | true | false | false | false | 1.933762 | 0.616368 | 61.63679 | 0.630836 | 46.73374 | 0.307402 | 30.740181 | 0.361577 | 14.876957 | 0.487448 | 21.364323 | 0.523271 | 47.030142 | false | false | 2025-01-31 | 2025-02-27 | 2 | Qwen/Qwen2.5-14B |
ruizhe1217_sft-s1-qwen-0.5b_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | [ruizhe1217/sft-s1-qwen-0.5b](https://huggingface.co/ruizhe1217/sft-s1-qwen-0.5b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ruizhe1217__sft-s1-qwen-0.5b-details) | ruizhe1217/sft-s1-qwen-0.5b | 2f8e051a801011cc906efe56c535aab5608aa341 | 9.240286 | apache-2.0 | 0 | 0.494 | true | false | false | false | 0.535855 | 0.274875 | 27.487511 | 0.330054 | 8.276264 | 0.061934 | 6.193353 | 0.270973 | 2.796421 | 0.319583 | 0.78125 | 0.189162 | 9.906915 | false | false | 2025-02-26 | 2025-02-27 | 1 | Qwen/Qwen2.5-0.5B |
rwitz_go-bruins-v2_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | [rwitz/go-bruins-v2](https://huggingface.co/rwitz/go-bruins-v2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/rwitz__go-bruins-v2-details) | rwitz/go-bruins-v2 | 6d9e57d3a36dbad364ec77ca642873d9fc7fd61c | 15.433967 | | 0 | 7.242 | false | false | false | true | 1.275639 | 0.409589 | 40.958878 | 0.379884 | 12.69326 | 0.067221 | 6.722054 | 0.262584 | 1.677852 | 0.41375 | 10.985417 | 0.276097 | 19.566342 | false | false | 2024-06-26 | | 0 | Removed |
sabersaleh_Llama2-7B-CPO_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/sabersaleh/Llama2-7B-CPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sabersaleh/Llama2-7B-CPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sabersaleh__Llama2-7B-CPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sabersaleh/Llama2-7B-CPO
|
cfc39fd915d4cb89283a901f0eed60f268ec8dce
| 7.30319 |
mit
| 0 | 7 | true | false | false | false | 0.954418 | 0.154549 | 15.454882 | 0.345792 | 8.656016 | 0.013595 | 1.359517 | 0.267617 | 2.348993 | 0.404823 | 9.269531 | 0.160572 | 6.730201 | false | false |
2024-11-30
|
2024-11-30
| 1 |
sabersaleh/Llama2-7B-CPO (Merge)
|
sabersaleh_Llama2-7B-DPO_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/sabersaleh/Llama2-7B-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sabersaleh/Llama2-7B-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sabersaleh__Llama2-7B-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sabersaleh/Llama2-7B-DPO
|
e07f7224c0ecd95eb8c82ae28e00c32031258942
| 7.558005 |
mit
| 0 | 7 | true | false | false | false | 0.83512 | 0.145331 | 14.533105 | 0.351222 | 9.362231 | 0.015861 | 1.586103 | 0.268456 | 2.46085 | 0.411365 | 10.453906 | 0.162566 | 6.951832 | false | false |
2024-11-30
|
2024-11-30
| 1 |
sabersaleh/Llama2-7B-DPO (Merge)
|
sabersaleh_Llama2-7B-IPO_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/sabersaleh/Llama2-7B-IPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sabersaleh/Llama2-7B-IPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sabersaleh__Llama2-7B-IPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sabersaleh/Llama2-7B-IPO
|
424beb187852f704718d75cf9f2ac6c63e10d941
| 7.804715 |
mit
| 0 | 7 | true | false | false | false | 0.852331 | 0.176855 | 17.685519 | 0.347455 | 9.019805 | 0.015861 | 1.586103 | 0.267617 | 2.348993 | 0.40476 | 9.328385 | 0.161735 | 6.859486 | false | false |
2024-11-30
|
2024-11-30
| 1 |
sabersaleh/Llama2-7B-IPO (Merge)
|
sabersaleh_Llama2-7B-KTO_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/sabersaleh/Llama2-7B-KTO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sabersaleh/Llama2-7B-KTO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sabersaleh__Llama2-7B-KTO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sabersaleh/Llama2-7B-KTO
|
60ebb9b532251942686b0cd79cbf56e6694f6e0c
| 7.882716 |
mit
| 0 | 7 | true | false | false | false | 1.256058 | 0.15285 | 15.284999 | 0.350076 | 9.514964 | 0.018882 | 1.888218 | 0.267617 | 2.348993 | 0.416698 | 11.18724 | 0.163647 | 7.071882 | false | false |
2024-11-30
|
2024-11-30
| 1 |
sabersaleh/Llama2-7B-KTO (Merge)
|
sabersaleh_Llama2-7B-SPO_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/sabersaleh/Llama2-7B-SPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sabersaleh/Llama2-7B-SPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sabersaleh__Llama2-7B-SPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sabersaleh/Llama2-7B-SPO
|
710558a6e70820a1d2f23823380a4accfed4d9b6
| 7.352632 |
mit
| 0 | 7 | true | false | false | false | 0.815769 | 0.156672 | 15.667207 | 0.33834 | 7.766131 | 0.019637 | 1.963746 | 0.276846 | 3.579418 | 0.387427 | 6.728385 | 0.175698 | 8.410904 | false | false |
2024-11-30
|
2024-11-30
| 1 |
sabersaleh/Llama2-7B-SPO (Merge)
|
sabersaleh_Llama2-7B-SimPO_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/sabersaleh/Llama2-7B-SimPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sabersaleh/Llama2-7B-SimPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sabersaleh__Llama2-7B-SimPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sabersaleh/Llama2-7B-SimPO
|
860de39d93c457d719c3f299e06ba4897aa51f3d
| 7.610783 |
mit
| 0 | 7 | true | false | false | false | 0.947604 | 0.165864 | 16.586435 | 0.348916 | 8.981211 | 0.015861 | 1.586103 | 0.270973 | 2.796421 | 0.400698 | 8.58724 | 0.164146 | 7.12729 | false | false |
2024-11-30
|
2024-11-30
| 1 |
sabersaleh/Llama2-7B-SimPO (Merge)
|
sabersaleh_Llama3_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/sabersaleh/Llama3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sabersaleh/Llama3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sabersaleh__Llama3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sabersaleh/Llama3
|
56cdeb0c32b330835c4a88f480066e0308ecf127
| 17.458609 |
mit
| 0 | 8.03 | true | false | false | false | 1.393266 | 0.332078 | 33.207778 | 0.478219 | 26.706794 | 0.056647 | 5.664653 | 0.310403 | 8.053691 | 0.393344 | 7.101302 | 0.316157 | 24.017435 | false | false |
2024-11-29
|
2024-11-29
| 1 |
sabersaleh/Llama3 (Merge)
|
sabersalehk_Llama3-001-300_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/sabersalehk/Llama3-001-300" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sabersalehk/Llama3-001-300</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sabersalehk__Llama3-001-300-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sabersalehk/Llama3-001-300
|
17152ae8544f09f2fa25ae276d1d56ca3302e631
| 17.121077 |
mit
| 0 | 8.03 | true | false | false | false | 1.399066 | 0.317864 | 31.786438 | 0.474458 | 25.706819 | 0.05287 | 5.287009 | 0.299497 | 6.599553 | 0.406396 | 9.366146 | 0.315824 | 23.980496 | false | false |
2024-12-03
|
2024-12-03
| 1 |
sabersalehk/Llama3-001-300 (Merge)
|
sabersalehk_Llama3-SimPO_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/sabersalehk/Llama3-SimPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sabersalehk/Llama3-SimPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sabersalehk__Llama3-SimPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sabersalehk/Llama3-SimPO
|
022f6b4f31728945bc031e2cbf1ed461a8148642
| 18.716098 |
mit
| 0 | 8.03 | true | false | false | false | 1.405379 | 0.364201 | 36.420143 | 0.487354 | 27.448562 | 0.057402 | 5.740181 | 0.307886 | 7.718121 | 0.404594 | 11.007552 | 0.315658 | 23.962027 | false | false |
2024-12-02
|
2024-12-02
| 1 |
sabersalehk/Llama3-SimPO (Merge)
|
sabersalehk_Llama3_001_200_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/sabersalehk/Llama3_001_200" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sabersalehk/Llama3_001_200</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sabersalehk__Llama3_001_200-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sabersalehk/Llama3_001_200
|
f0a8d4ac002abf89f19a43c32d945b415b7bfe5d
| 17.287941 |
mit
| 0 | 8.03 | true | false | false | false | 1.452168 | 0.321836 | 32.183606 | 0.472792 | 25.625568 | 0.05136 | 5.135952 | 0.303691 | 7.158837 | 0.403729 | 9.366146 | 0.318318 | 24.257535 | false | false |
2024-12-03
|
2024-12-03
| 1 |
sabersalehk/Llama3_001_200 (Merge)
|
sabersalehk_Llama3_01_300_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/sabersalehk/Llama3_01_300" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sabersalehk/Llama3_01_300</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sabersalehk__Llama3_01_300-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sabersalehk/Llama3_01_300
|
159a25046a6bc0be3706e9a49389de4b72b21707
| 16.698824 |
mit
| 0 | 8.03 | true | false | false | false | 1.448543 | 0.295883 | 29.58827 | 0.469139 | 25.15525 | 0.049849 | 4.984894 | 0.307886 | 7.718121 | 0.40649 | 9.144531 | 0.312417 | 23.601876 | false | false |
2024-12-03
|
2024-12-03
| 1 |
sabersalehk/Llama3_01_300 (Merge)
|
saishf_Fimbulvetr-Kuro-Lotus-10.7B_float16
|
float16
|
🤝 base merges and moerges
|
🤝
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/saishf/Fimbulvetr-Kuro-Lotus-10.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">saishf/Fimbulvetr-Kuro-Lotus-10.7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/saishf__Fimbulvetr-Kuro-Lotus-10.7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
saishf/Fimbulvetr-Kuro-Lotus-10.7B
|
ec1288fd8c06ac408a2a7e503ea62ac300e474e1
| 20.677867 |
cc-by-nc-4.0
| 18 | 10.732 | true | false | false | true | 1.618337 | 0.493944 | 49.394385 | 0.434232 | 19.908821 | 0.053625 | 5.362538 | 0.301174 | 6.823266 | 0.44451 | 16.030469 | 0.33893 | 26.547725 | true | false |
2024-02-13
|
2024-07-09
| 1 |
saishf/Fimbulvetr-Kuro-Lotus-10.7B (Merge)
|
saishf_Neural-SOVLish-Devil-8B-L3_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/saishf/Neural-SOVLish-Devil-8B-L3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">saishf/Neural-SOVLish-Devil-8B-L3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/saishf__Neural-SOVLish-Devil-8B-L3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
saishf/Neural-SOVLish-Devil-8B-L3
|
3df738f6b3512f5f9571f862811717e1fc8c36b6
| 21.691331 |
cc-by-nc-4.0
| 10 | 8.03 | true | false | false | false | 1.379756 | 0.41988 | 41.988036 | 0.51418 | 30.100237 | 0.089124 | 8.912387 | 0.307886 | 7.718121 | 0.410958 | 10.236458 | 0.380735 | 31.192745 | true | false |
2024-05-28
|
2025-02-02
| 1 |
saishf/Neural-SOVLish-Devil-8B-L3 (Merge)
|
saishshinde15_TethysAI_Base_Reasoning_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/saishshinde15/TethysAI_Base_Reasoning" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">saishshinde15/TethysAI_Base_Reasoning</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/saishshinde15__TethysAI_Base_Reasoning-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
saishshinde15/TethysAI_Base_Reasoning
|
6c3b2772655a55e5b8e30265b985a9ee84cdb6e8
| 26.354839 |
apache-2.0
| 1 | 3.086 | true | false | false | true | 0.735945 | 0.636876 | 63.687571 | 0.451856 | 23.597503 | 0.314199 | 31.41994 | 0.286074 | 4.809843 | 0.407458 | 9.765625 | 0.323637 | 24.848552 | false | false |
2025-02-21
|
2025-02-25
| 1 |
saishshinde15/TethysAI_Base_Reasoning (Merge)
|
saishshinde15_TethysAI_Vortex_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/saishshinde15/TethysAI_Vortex" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">saishshinde15/TethysAI_Vortex</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/saishshinde15__TethysAI_Vortex-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
saishshinde15/TethysAI_Vortex
|
9209d07a2dd5aa2226e4bde09cfeb30f5ed70c8d
| 24.803373 | 0 | 3.086 | false | false | false | true | 0.743528 | 0.429772 | 42.977189 | 0.474926 | 26.914314 | 0.314955 | 31.495468 | 0.305369 | 7.38255 | 0.445781 | 15.15599 | 0.324053 | 24.894725 | false | false |
2025-02-25
| 0 |
Removed
|
||
saishshinde15_TethysAI_Vortex_Reasoning_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/saishshinde15/TethysAI_Vortex_Reasoning" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">saishshinde15/TethysAI_Vortex_Reasoning</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/saishshinde15__TethysAI_Vortex_Reasoning-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
saishshinde15/TethysAI_Vortex_Reasoning
|
97e0c62c764fad13fcba5735c2d6564ee01e3951
| 21.791498 |
apache-2.0
| 1 | 3.086 | true | false | false | false | 0.78844 | 0.40212 | 40.211971 | 0.469381 | 25.738138 | 0.214502 | 21.450151 | 0.30453 | 7.270694 | 0.408448 | 9.622656 | 0.338098 | 26.455378 | false | false |
2025-02-24
|
2025-02-25
| 1 |
saishshinde15/TethysAI_Vortex_Reasoning (Merge)
|
sakaltcommunity_novablast-preview_float16
|
float16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/sakaltcommunity/novablast-preview" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sakaltcommunity/novablast-preview</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sakaltcommunity__novablast-preview-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sakaltcommunity/novablast-preview
|
a9cb798ec06f69ca67f4645020ba2d0eee4ffd58
| 41.516418 |
apache-2.0
| 0 | 32.764 | true | false | false | false | 11.262021 | 0.453028 | 45.302797 | 0.704277 | 58.18216 | 0.489426 | 48.942598 | 0.381711 | 17.561521 | 0.502115 | 24.497656 | 0.591506 | 54.611776 | true | false |
2024-12-14
|
2024-12-18
| 1 |
sakaltcommunity/novablast-preview (Merge)
|
sakaltcommunity_sakaltum-7b_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/sakaltcommunity/sakaltum-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sakaltcommunity/sakaltum-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sakaltcommunity__sakaltum-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sakaltcommunity/sakaltum-7b
|
692d1c3efdae68a3ace336d865daceb713b93130
| 13.528324 |
mit
| 0 | 7.242 | true | false | false | false | 0.944304 | 0.260387 | 26.038688 | 0.457521 | 23.752645 | 0.029456 | 2.945619 | 0.272651 | 3.020134 | 0.3775 | 5.754167 | 0.276928 | 19.658688 | true | false |
2024-12-13
|
2024-12-13
| 1 |
sakaltcommunity/sakaltum-7b (Merge)
|
sakhan10_quantized_open_llama_3b_v2_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/sakhan10/quantized_open_llama_3b_v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sakhan10/quantized_open_llama_3b_v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sakhan10__quantized_open_llama_3b_v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sakhan10/quantized_open_llama_3b_v2
|
e8d51ad5204806edf9c2eeb8c56139a440a70265
| 5.1425 | 0 | 3 | false | false | false | false | 0.785401 | 0.187222 | 18.722213 | 0.30198 | 2.805733 | 0 | 0 | 0.276846 | 3.579418 | 0.368167 | 4.6875 | 0.109541 | 1.060136 | false | false |
2024-08-23
|
2024-08-28
| 1 |
openlm-research/open_llama_3b_v2
|
|
saltlux_luxia-21.4b-alignment-v1.0_float16
|
float16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/saltlux/luxia-21.4b-alignment-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">saltlux/luxia-21.4b-alignment-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/saltlux__luxia-21.4b-alignment-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
saltlux/luxia-21.4b-alignment-v1.0
|
87d5673e6d9f60462f195e9414a0bf6874c89ceb
| 23.454574 |
apache-2.0
| 33 | 21.421 | true | false | false | true | 3.488095 | 0.369297 | 36.92968 | 0.637334 | 48.021113 | 0.097432 | 9.743202 | 0.301174 | 6.823266 | 0.432844 | 12.505469 | 0.340342 | 26.704713 | false | false |
2024-03-12
|
2024-06-29
| 0 |
saltlux/luxia-21.4b-alignment-v1.0
|
saltlux_luxia-21.4b-alignment-v1.2_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/saltlux/luxia-21.4b-alignment-v1.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">saltlux/luxia-21.4b-alignment-v1.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/saltlux__luxia-21.4b-alignment-v1.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
saltlux/luxia-21.4b-alignment-v1.2
|
eed12b5574fa49cc81e57a88aff24c08c13721c0
| 24.58071 |
apache-2.0
| 9 | 21.421 | true | false | false | true | 4.091852 | 0.411537 | 41.153694 | 0.637118 | 47.769165 | 0.084592 | 8.459215 | 0.307886 | 7.718121 | 0.445896 | 14.903646 | 0.347324 | 27.480423 | false | false |
2024-05-27
|
2024-07-30
| 0 |
saltlux/luxia-21.4b-alignment-v1.2
|
sam-paech_Darkest-muse-v1_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/sam-paech/Darkest-muse-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sam-paech/Darkest-muse-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sam-paech__Darkest-muse-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sam-paech/Darkest-muse-v1
|
55f6ba0218e9615d18a76f244a874b941f8c434f
| 33.447324 |
apache-2.0
| 67 | 10.159 | true | false | false | false | 4.413895 | 0.73442 | 73.442023 | 0.596844 | 42.611731 | 0.214502 | 21.450151 | 0.34396 | 12.527964 | 0.450208 | 15.276042 | 0.418384 | 35.376034 | false | false |
2024-10-22
|
2024-10-26
| 1 |
sam-paech/Darkest-muse-v1 (Merge)
|
sam-paech_Delirium-v1_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/sam-paech/Delirium-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sam-paech/Delirium-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sam-paech__Delirium-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sam-paech/Delirium-v1
|
98dc2dad47af405013c0584d752504ca448bd8eb
| 33.091835 |
gemma
| 17 | 9.242 | true | false | false | false | 4.791002 | 0.720756 | 72.075648 | 0.596211 | 42.315079 | 0.210725 | 21.072508 | 0.343121 | 12.416107 | 0.451448 | 15.23099 | 0.418966 | 35.440677 | false | false |
2024-10-17
|
2024-10-26
| 1 |
unsloth/gemma-2-9b-it
|
sam-paech_Quill-v1_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/sam-paech/Quill-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sam-paech/Quill-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sam-paech__Quill-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sam-paech/Quill-v1
|
3cab1cac9d3de0d25b48ea86b4533aa220231f20
| 33.063947 | 13 | 9.242 | false | false | false | false | 4.626938 | 0.712214 | 71.221359 | 0.596923 | 42.597669 | 0.212236 | 21.223565 | 0.339765 | 11.96868 | 0.455479 | 16.134896 | 0.417138 | 35.237515 | false | false |
2024-10-20
|
2024-10-26
| 1 |
sam-paech/Quill-v1 (Merge)
|
|
sarvamai_OpenHathi-7B-Hi-v0.1-Base_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/sarvamai/OpenHathi-7B-Hi-v0.1-Base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sarvamai/OpenHathi-7B-Hi-v0.1-Base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sarvamai__OpenHathi-7B-Hi-v0.1-Base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sarvamai/OpenHathi-7B-Hi-v0.1-Base
|
2cb5807b852028defa07c56c96a7ff5c11f8df0e
| 6.338694 |
llama2
| 109 | 6.87 | true | false | false | false | 1.036454 | 0.180402 | 18.040244 | 0.335405 | 7.645607 | 0.008308 | 0.830816 | 0.253356 | 0.447427 | 0.365844 | 5.030469 | 0.154338 | 6.037603 | false | false |
2023-12-13
|
2025-02-06
| 0 |
sarvamai/OpenHathi-7B-Hi-v0.1-Base
|
schnapss_testmerge-7b_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/schnapss/testmerge-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">schnapss/testmerge-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/schnapss__testmerge-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
schnapss/testmerge-7b
|
ff84f5b87ba51db9622b1c553c076533890a8f50
| 20.913446 | 0 | 7.242 | false | false | false | false | 0.940309 | 0.392228 | 39.222818 | 0.518748 | 32.638166 | 0.068731 | 6.873112 | 0.296141 | 6.152125 | 0.468563 | 17.703646 | 0.306017 | 22.89081 | false | false |
2024-11-16
|
2024-11-16
| 1 |
schnapss/testmerge-7b (Merge)
|
|
sci-m-wang_Mistral-7B-Instruct-sa-v0.1_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Adapter
|
?
|
<a target="_blank" href="https://huggingface.co/sci-m-wang/Mistral-7B-Instruct-sa-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sci-m-wang/Mistral-7B-Instruct-sa-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sci-m-wang__Mistral-7B-Instruct-sa-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sci-m-wang/Mistral-7B-Instruct-sa-v0.1
|
2dcff66eac0c01dc50e4c41eea959968232187fe
| 12.263005 |
other
| 0 | 14.483 | true | false | false | true | 1.530165 | 0.433519 | 43.351862 | 0.327278 | 5.743646 | 0.01435 | 1.435045 | 0.259228 | 1.230425 | 0.39 | 6.683333 | 0.236203 | 15.133717 | false | false |
2024-05-31
|
2024-06-27
| 2 |
mistralai/Mistral-7B-v0.1
|
sci-m-wang_Phi-3-mini-4k-instruct-sa-v0.1_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Adapter
|
?
|
<a target="_blank" href="https://huggingface.co/sci-m-wang/Phi-3-mini-4k-instruct-sa-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sci-m-wang/Phi-3-mini-4k-instruct-sa-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sci-m-wang__Phi-3-mini-4k-instruct-sa-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sci-m-wang/Phi-3-mini-4k-instruct-sa-v0.1
|
5a516f86087853f9d560c95eb9209c1d4ed9ff69
| 25.824145 |
other
| 0 | 7.642 | true | false | false | true | 2.561005 | 0.502062 | 50.206231 | 0.550204 | 36.605419 | 0.148036 | 14.803625 | 0.328859 | 10.514541 | 0.407302 | 9.646094 | 0.398521 | 33.168957 | false | false |
2024-06-01
|
2024-06-27
| 1 |
microsoft/Phi-3-mini-4k-instruct
|
sci-m-wang_deepseek-llm-7b-chat-sa-v0.1_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Adapter
|
?
|
<a target="_blank" href="https://huggingface.co/sci-m-wang/deepseek-llm-7b-chat-sa-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sci-m-wang/deepseek-llm-7b-chat-sa-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sci-m-wang__deepseek-llm-7b-chat-sa-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sci-m-wang/deepseek-llm-7b-chat-sa-v0.1
|
afbda8b347ec881666061fa67447046fc5164ec8
| 13.20805 |
other
| 0 | 7 | true | false | false | true | 1.983148 | 0.403594 | 40.359358 | 0.371772 | 12.051975 | 0.026435 | 2.643505 | 0.256711 | 0.894855 | 0.417313 | 9.864062 | 0.220911 | 13.434545 | false | false |
2024-05-31
|
2024-06-27
| 1 |
deepseek-ai/deepseek-llm-7b-chat
|
securin_Securin-LLM-V2.5-Qwen-1.5B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/securin/Securin-LLM-V2.5-Qwen-1.5B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">securin/Securin-LLM-V2.5-Qwen-1.5B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/securin__Securin-LLM-V2.5-Qwen-1.5B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
securin/Securin-LLM-V2.5-Qwen-1.5B
|
8d14c68eec2049d59b2f3262b323c6036754864c
| 5.22568 | 0 | 1.543 | false | false | false | false | 1.209571 | 0.149203 | 14.9203 | 0.315842 | 4.863456 | 0.024924 | 2.492447 | 0.25 | 0 | 0.360635 | 2.246094 | 0.161486 | 6.831782 | false | false |
2024-12-15
|
2024-12-08
| 1 |
securin/Securin-LLM-V2.5-Qwen-1.5B (Merge)
|
|
senseable_WestLake-7B-v2_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/senseable/WestLake-7B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">senseable/WestLake-7B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/senseable__WestLake-7B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
senseable/WestLake-7B-v2
|
41625004c47628837678859753b94c50c82f3bec
| 16.257065 |
apache-2.0
| 111 | 7.242 | true | false | false | true | 1.262023 | 0.441862 | 44.186204 | 0.407328 | 17.858142 | 0.048338 | 4.833837 | 0.276846 | 3.579418 | 0.393719 | 7.48151 | 0.27643 | 19.60328 | false | false |
2024-01-22
|
2024-07-23
| 0 |
senseable/WestLake-7B-v2
|
sequelbox_Llama3.1-70B-PlumChat_float16
|
float16
|
🤝 base merges and moerges
|
🤝
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/sequelbox/Llama3.1-70B-PlumChat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sequelbox/Llama3.1-70B-PlumChat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sequelbox__Llama3.1-70B-PlumChat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sequelbox/Llama3.1-70B-PlumChat
|
bef139c3f9ee73c32559518b951d0465ab36190c
| 37.409206 |
llama3.1
| 1 | 70.554 | true | false | false | false | 65.544135 | 0.561613 | 56.161319 | 0.675282 | 52.812752 | 0.30287 | 30.287009 | 0.39094 | 18.791946 | 0.477375 | 20.138542 | 0.516373 | 46.263667 | true | false |
2024-11-17
|
2024-11-27
| 1 |
sequelbox/Llama3.1-70B-PlumChat (Merge)
|
sequelbox_Llama3.1-8B-MOTH_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/sequelbox/Llama3.1-8B-MOTH" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sequelbox/Llama3.1-8B-MOTH</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sequelbox__Llama3.1-8B-MOTH-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sequelbox/Llama3.1-8B-MOTH
|
8db363e36b1efc9015ab14648e68bcfba9e8d8a0
| 20.836504 |
llama3.1
| 1 | 8.03 | true | false | false | true | 2.929274 | 0.524494 | 52.44939 | 0.490247 | 27.916332 | 0.121601 | 12.160121 | 0.268456 | 2.46085 | 0.368917 | 4.047917 | 0.33386 | 25.984412 | false | false |
2024-09-01
|
2024-09-19
| 2 |
meta-llama/Meta-Llama-3.1-8B
|
sequelbox_Llama3.1-8B-PlumChat_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/sequelbox/Llama3.1-8B-PlumChat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sequelbox/Llama3.1-8B-PlumChat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sequelbox__Llama3.1-8B-PlumChat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sequelbox/Llama3.1-8B-PlumChat
|
1afdc9856591f573e4fcb52dba19a9d8da631e0b
| 13.21473 |
llama3.1
| 0 | 8.03 | true | false | false | true | 1.978514 | 0.424276 | 42.427648 | 0.387329 | 13.935991 | 0.036254 | 3.625378 | 0.265101 | 2.013423 | 0.375458 | 4.765625 | 0.212683 | 12.520316 | true | false |
2024-10-02
|
2024-10-03
| 1 |
sequelbox/Llama3.1-8B-PlumChat (Merge)
|
sequelbox_Llama3.1-8B-PlumCode_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/sequelbox/Llama3.1-8B-PlumCode" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sequelbox/Llama3.1-8B-PlumCode</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sequelbox__Llama3.1-8B-PlumCode-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sequelbox/Llama3.1-8B-PlumCode
|
171cd599d574000607491f08e6cf7b7eb199e33d
| 9.824 |
llama3.1
| 0 | 8.03 | true | false | false | false | 1.781352 | 0.204483 | 20.448299 | 0.336809 | 8.502927 | 0.02719 | 2.719033 | 0.276007 | 3.467562 | 0.377344 | 8.967969 | 0.233544 | 14.838209 | true | false |
2024-10-02
|
2024-10-03
| 1 |
sequelbox/Llama3.1-8B-PlumCode (Merge)
|
sequelbox_Llama3.1-8B-PlumMath_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/sequelbox/Llama3.1-8B-PlumMath" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sequelbox/Llama3.1-8B-PlumMath</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sequelbox__Llama3.1-8B-PlumMath-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sequelbox/Llama3.1-8B-PlumMath
|
b857c30a626f7c020fcba89df7bece4bb7381ac2
| 13.936685 |
llama3.1
| 1 | 8.03 | true | false | false | false | 1.737544 | 0.224242 | 22.424168 | 0.40323 | 16.446584 | 0.047583 | 4.758308 | 0.317953 | 9.060403 | 0.391854 | 8.981771 | 0.29754 | 21.948877 | true | false |
2024-10-01
|
2024-10-03
| 1 |
sequelbox/Llama3.1-8B-PlumMath (Merge)
|
sequelbox_gemma-2-9B-MOTH_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/sequelbox/gemma-2-9B-MOTH" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sequelbox/gemma-2-9B-MOTH</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sequelbox__gemma-2-9B-MOTH-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sequelbox/gemma-2-9B-MOTH
|
8dff98ab82ba0087706afa0d6c69874a45548212
| 4.729558 |
gemma
| 0 | 9.242 | true | false | false | true | 6.055898 | 0.205882 | 20.588151 | 0.30797 | 3.212217 | 0.010574 | 1.057402 | 0.260067 | 1.342282 | 0.340948 | 0.61849 | 0.114029 | 1.558806 | false | false |
2024-09-09
|
2024-09-10
| 2 |
google/gemma-2-9b
|
sethuiyer_Llama-3.1-8B-Experimental-1206-Instruct_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/sethuiyer/Llama-3.1-8B-Experimental-1206-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sethuiyer/Llama-3.1-8B-Experimental-1206-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sethuiyer__Llama-3.1-8B-Experimental-1206-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sethuiyer/Llama-3.1-8B-Experimental-1206-Instruct
|
5dde2c4f0f907b00cc490a6b1fe492697395eff3
| 25.684511 | 1 | 8.03 | false | false | false | true | 1.370784 | 0.696701 | 69.670142 | 0.510381 | 30.055034 | 0.111782 | 11.178248 | 0.299497 | 6.599553 | 0.396573 | 8.504948 | 0.352892 | 28.099143 | false | false |
2025-01-18
|
2025-01-19
| 1 |
sethuiyer/Llama-3.1-8B-Experimental-1206-Instruct (Merge)
|
|
sethuiyer_Llama-3.1-8B-Experimental-1208-Instruct_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/sethuiyer/Llama-3.1-8B-Experimental-1208-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sethuiyer/Llama-3.1-8B-Experimental-1208-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sethuiyer__Llama-3.1-8B-Experimental-1208-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sethuiyer/Llama-3.1-8B-Experimental-1208-Instruct
|
4afc818bdd3890a71ac8c31bde9e424e43a86bd7
| 23.50982 | 0 | 8.03 | false | false | false | true | 1.513561 | 0.609998 | 60.999814 | 0.496423 | 29.491581 | 0.089124 | 8.912387 | 0.296141 | 6.152125 | 0.37899 | 7.607031 | 0.351064 | 27.895981 | false | false |
2025-01-18
| 0 |
Removed
|
||
sethuiyer_LlamaZero-3.1-8B-Experimental-1208_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/sethuiyer/LlamaZero-3.1-8B-Experimental-1208" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sethuiyer/LlamaZero-3.1-8B-Experimental-1208</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sethuiyer__LlamaZero-3.1-8B-Experimental-1208-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sethuiyer/LlamaZero-3.1-8B-Experimental-1208
|
8210bbb6d9284b11e168a184e0d6b68c58e419b0
| 21.958508 | 0 | 8.03 | false | false | false | true | 1.746595 | 0.605102 | 60.510224 | 0.498137 | 28.612688 | 0.108006 | 10.800604 | 0.268456 | 2.46085 | 0.382 | 7.15 | 0.29995 | 22.216681 | false | false |
2025-01-20
|
2025-01-20
| 1 |
sethuiyer/LlamaZero-3.1-8B-Experimental-1208 (Merge)
|
Subsets and Splits

Saved queries available in the dataset viewer's SQL console (a sketch of the first one is given after this list):

- Top 100 Official Models <70: the top 100 high-scoring, officially provided models with fewer than 70 billion parameters, for comparing performance metrics.
- Top 100 Official Models < 2: top-performing models with fewer than 20 billion parameters, highlighting efficiency in smaller models.
- Top 500 Official Models by Score: ranks models by a combined IFEval and MMLU-PRO score, filtered by official provider and parameter count.
- Top 200 Official Models by Score: high-performing models with fewer than 70 billion parameters, with their evaluation scores and characteristics.
- SQL Console for open-llm-leaderboard/contents: top-performing models with fewer than 70 billion parameters, combining two evaluation metrics to surface balanced options.
- Top 10 Official Leaderboard Models: the top 10 official-provider models under 13 billion parameters, ordered by their average metric.
- SQL Console for open-llm-leaderboard/contents: filters and ranks LlamaForCausalLM models in the 6-8 billion parameter range by their average metric.
- SQL Console for open-llm-leaderboard/contents: chat models from official providers, as a filtered view of the dataset.
- SQL Console for open-llm-leaderboard/contents: entries marked as "Official Providers", a basic filter.
- Top 10 Official Training Data: a small sample of records from the 'train' split where the "Official Providers" flag is true.
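For reference, here is a minimal Python sketch approximating the first saved query above. It assumes the dataset id open-llm-leaderboard/contents and the 'train' split named in the query titles and descriptions, plus the column names used in this table ("Official Providers", "#Params (B)", "Average ⬆️", "fullname"); the saved queries themselves run as SQL in the dataset viewer's console, so this is an approximation rather than the exact query.

```python
# Minimal sketch of the "Top 100 Official Models <70" query, approximated in Python.
# Assumptions: dataset id "open-llm-leaderboard/contents", split "train", and the
# column names "Official Providers", "#Params (B)", "Average ⬆️", "fullname".
from datasets import load_dataset

# Load the leaderboard contents and convert to a pandas DataFrame.
df = load_dataset("open-llm-leaderboard/contents", split="train").to_pandas()

top_official = (
    df[(df["Official Providers"]) & (df["#Params (B)"] < 70)]  # official models under 70B params
    .sort_values("Average ⬆️", ascending=False)                # rank by the leaderboard average
    .head(100)                                                 # keep the top 100
)

print(top_official[["fullname", "#Params (B)", "Average ⬆️"]].to_string(index=False))
```

The same filter maps directly onto a WHERE / ORDER BY / LIMIT clause in the SQL console.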