| Column | Dtype | Values |
| --- | --- | --- |
| eval_name | string | length 12–111 |
| Precision | string | 3 distinct values |
| Type | string | 7 distinct values |
| T | string | 7 distinct values |
| Weight type | string | 3 distinct values |
| Architecture | string | 62 distinct values |
| Model | string | length 355–689 |
| fullname | string | length 4–102 |
| Model sha | string | length 0–40 |
| Average ⬆️ | float64 | min 0.74, max 52 |
| Hub License | string | 27 distinct values |
| Hub ❤️ | int64 | min 0, max 5.99k |
| #Params (B) | float64 | min -1, max 141 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | min 0.03, max 107 |
| IFEval Raw | float64 | min 0, max 0.9 |
| IFEval | float64 | min 0, max 90 |
| BBH Raw | float64 | min 0.24, max 0.75 |
| BBH | float64 | min 0.25, max 64.1 |
| MATH Lvl 5 Raw | float64 | min 0, max 0.52 |
| MATH Lvl 5 | float64 | min 0, max 52.4 |
| GPQA Raw | float64 | min 0.21, max 0.47 |
| GPQA | float64 | min 0, max 29.4 |
| MUSR Raw | float64 | min 0.29, max 0.6 |
| MUSR | float64 | min 0, max 38.5 |
| MMLU-PRO Raw | float64 | min 0.1, max 0.73 |
| MMLU-PRO | float64 | min 0, max 70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | string | 480 distinct values |
| Submission Date | string | 220 distinct values |
| Generation | int64 | min 0, max 10 |
| Base Model | string | length 4–102 |
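The table above is the column schema of this leaderboard table; each record below lists its values in the same column order, one value per line, with empty fields simply skipped. A minimal sketch of loading and filtering the table with 🤗 Datasets follows. The repo id `open-llm-leaderboard/contents` and the `train` split are assumptions (only the per-model `*-details` repos are linked on this page), so substitute whatever repo actually hosts this table.

```python
# Minimal sketch: load the leaderboard table and pick out a few rows.
# ASSUMPTION: the aggregated table is hosted as "open-llm-leaderboard/contents"
# with a "train" split; only the per-model "*-details" repos are linked above,
# so adjust the repo id if this table lives elsewhere.
from datasets import load_dataset

ds = load_dataset("open-llm-leaderboard/contents", split="train")

# Keep official-provider, non-flagged models up to 15B parameters,
# then rank them by the leaderboard average.
rows = [
    r for r in ds
    if r["Official Providers"] and not r["Flagged"] and 0 < r["#Params (B)"] <= 15
]
rows.sort(key=lambda r: r["Average ⬆️"], reverse=True)

for r in rows[:5]:
    print(f'{r["fullname"]:60} {r["Average ⬆️"]:6.2f} {r["Precision"]}')
```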
BlackBeenie_Llama-3.1-8B-OpenO1-SFT-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BlackBeenie/Llama-3.1-8B-OpenO1-SFT-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BlackBeenie/Llama-3.1-8B-OpenO1-SFT-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__Llama-3.1-8B-OpenO1-SFT-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BlackBeenie/Llama-3.1-8B-OpenO1-SFT-v0.1
35e7781b9dff5aea29576709201d641e5f44440d
21.035355
apache-2.0
1
8.03
true
false
false
true
0.731428
0.512404
51.240376
0.478745
26.03429
0.130665
13.066465
0.268456
2.46085
0.361813
5.726563
0.349152
27.683585
false
false
2024-12-28
2024-12-29
1
BlackBeenie/Llama-3.1-8B-OpenO1-SFT-v0.1 (Merge)
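Each benchmark appears twice per record: a raw accuracy (e.g. `IFEval Raw`) and a normalized 0–100 score (e.g. `IFEval`), and `Average ⬆️` is the plain mean of the six normalized scores. The sketch below reproduces the derived values of the record above; the GPQA rescaling shown, (raw − 0.25) / 0.75 × 100, matches this record but should be treated as an assumption about the leaderboard's normalization rather than a spec.

```python
# Reproducing the derived fields of the record above
# (BlackBeenie/Llama-3.1-8B-OpenO1-SFT-v0.1, bfloat16).
normalized = {
    "IFEval": 51.240376,
    "BBH": 26.03429,
    "MATH Lvl 5": 13.066465,
    "GPQA": 2.46085,
    "MUSR": 5.726563,
    "MMLU-PRO": 27.683585,
}

# "Average ⬆️" is the mean of the six normalized scores.
print(round(sum(normalized.values()) / len(normalized), 6))  # 21.035355

# GPQA normalized score: raw accuracy rescaled against the 0.25
# random-guess baseline (assumed formula; it reproduces this record).
gpqa_raw = 0.268456
print(round((gpqa_raw - 0.25) / 0.75 * 100, 4))  # 2.4608 (listed: 2.46085; the raw field is rounded)

# IFEval and MATH Lvl 5 normalized scores are raw * 100 here.
print(round(0.512404 * 100, 4), round(0.130665 * 100, 4))  # 51.2404 13.0665
```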
BlackBeenie_Llama-3.1-8B-pythonic-passthrough-merge_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BlackBeenie/Llama-3.1-8B-pythonic-passthrough-merge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BlackBeenie/Llama-3.1-8B-pythonic-passthrough-merge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__Llama-3.1-8B-pythonic-passthrough-merge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BlackBeenie/Llama-3.1-8B-pythonic-passthrough-merge
3ec46616f5b34821b3b928938931295f92e49213
7.311462
0
20.245
false
false
false
false
3.58329
0.231586
23.158553
0.345385
9.359905
0.006042
0.60423
0.268456
2.46085
0.377812
4.593229
0.133228
3.692007
false
false
2024-11-06
2024-11-06
1
BlackBeenie/Llama-3.1-8B-pythonic-passthrough-merge (Merge)
BlackBeenie_Neos-Gemma-2-9b_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/BlackBeenie/Neos-Gemma-2-9b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BlackBeenie/Neos-Gemma-2-9b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__Neos-Gemma-2-9b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BlackBeenie/Neos-Gemma-2-9b
56dbbb4f972be887e5b57311a8a32e148e98d154
25.211313
apache-2.0
1
9.242
true
false
false
true
2.679092
0.587567
58.756655
0.550298
35.638851
0.082326
8.232628
0.322987
9.731544
0.36175
5.785417
0.398105
33.122784
false
false
2024-11-11
2024-11-11
1
BlackBeenie/Neos-Gemma-2-9b (Merge)
BlackBeenie_Neos-Llama-3.1-8B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BlackBeenie/Neos-Llama-3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BlackBeenie/Neos-Llama-3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__Neos-Llama-3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BlackBeenie/Neos-Llama-3.1-8B
9b48520ec1a777be0f1fd88f95454d85ac568407
19.461825
apache-2.0
1
8.03
true
false
false
true
0.793867
0.494394
49.439376
0.4425
21.080123
0.129154
12.915408
0.268456
2.46085
0.37499
5.740365
0.326213
25.134826
false
false
2024-11-12
2024-11-12
1
BlackBeenie/Neos-Llama-3.1-8B (Merge)
BlackBeenie_Neos-Llama-3.1-base_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BlackBeenie/Neos-Llama-3.1-base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BlackBeenie/Neos-Llama-3.1-base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__Neos-Llama-3.1-base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BlackBeenie/Neos-Llama-3.1-base
d4af4d73ba5fea0275fd1e3ba5102a79ac8009db
3.968795
0
4.65
false
false
false
true
1.409285
0.175082
17.508212
0.293034
2.221447
0
0
0.237416
0
0.349906
2.838281
0.111203
1.244829
false
false
2024-11-11
2024-11-11
0
BlackBeenie/Neos-Llama-3.1-base
BlackBeenie_Neos-Phi-3-14B-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/BlackBeenie/Neos-Phi-3-14B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BlackBeenie/Neos-Phi-3-14B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__Neos-Phi-3-14B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BlackBeenie/Neos-Phi-3-14B-v0.1
0afb7cc74a94f11f2695dc92788cdc6e28325f9c
26.843485
apache-2.0
0
13.96
true
false
false
true
0.909626
0.402245
40.224493
0.621193
46.631387
0.166918
16.691843
0.305369
7.38255
0.412542
10.534375
0.456366
39.596262
false
false
2024-11-27
2024-11-27
1
BlackBeenie/Neos-Phi-3-14B-v0.1 (Merge)
BlackBeenie_llama-3-luminous-merged_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BlackBeenie/llama-3-luminous-merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BlackBeenie/llama-3-luminous-merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__llama-3-luminous-merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BlackBeenie/llama-3-luminous-merged
64288dd8e3305f2dc11d84fe0c653f351b2e8a9d
21.480108
0
8.03
false
false
false
false
0.763854
0.432345
43.234507
0.515392
30.643687
0.07855
7.854985
0.292785
5.704698
0.414896
10.628646
0.377327
30.814125
false
false
2024-09-15
2024-10-11
1
BlackBeenie/llama-3-luminous-merged (Merge)
BlackBeenie_llama-3.1-8B-Galore-openassistant-guanaco_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BlackBeenie/llama-3.1-8B-Galore-openassistant-guanaco" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BlackBeenie/llama-3.1-8B-Galore-openassistant-guanaco</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__llama-3.1-8B-Galore-openassistant-guanaco-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BlackBeenie/llama-3.1-8B-Galore-openassistant-guanaco
828fa03c10e9085700b7abbe26f95067fab010fd
18.072101
1
8.03
false
false
false
false
0.85682
0.263484
26.348422
0.521337
31.444705
0.048338
4.833837
0.300336
6.711409
0.440625
14.578125
0.320645
24.516105
false
false
2024-10-16
2024-10-19
0
BlackBeenie/llama-3.1-8B-Galore-openassistant-guanaco
Bllossom_llama-3.2-Korean-Bllossom-AICA-5B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MllamaForConditionalGeneration
<a target="_blank" href="https://huggingface.co/Bllossom/llama-3.2-Korean-Bllossom-AICA-5B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Bllossom/llama-3.2-Korean-Bllossom-AICA-5B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Bllossom__llama-3.2-Korean-Bllossom-AICA-5B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Bllossom/llama-3.2-Korean-Bllossom-AICA-5B
4672b7de38c2cc390b146d6b6ce7a6dd295d8a0e
18.169448
llama3.2
53
5.199
true
false
false
true
0.610118
0.51725
51.724979
0.429307
18.650223
0.073263
7.326284
0.298658
6.487696
0.383396
5.824479
0.271027
19.003029
false
false
2024-12-12
2024-12-16
0
Bllossom/llama-3.2-Korean-Bllossom-AICA-5B
BoltMonkey_DreadMix_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BoltMonkey/DreadMix" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BoltMonkey/DreadMix</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BoltMonkey__DreadMix-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BoltMonkey/DreadMix
ab5dbaaff606538db73b6fd89aa169760104a566
28.661027
0
8.03
false
false
false
true
1.614205
0.709491
70.949082
0.54351
34.845015
0.149547
14.954683
0.299497
6.599553
0.421219
13.61901
0.378989
30.998818
false
false
2024-10-12
2024-10-13
1
BoltMonkey/DreadMix (Merge)
BoltMonkey_NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BoltMonkey__NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated
969e4c9b41e733a367f5ea18ed50a6171b5e2357
27.726282
llama3.1
2
8.03
true
false
false
true
1.640513
0.799891
79.989096
0.515199
30.7599
0.116314
11.63142
0.28104
4.138702
0.401875
9.467708
0.373338
30.370863
true
false
2024-10-01
2024-10-10
1
BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated (Merge)
BoltMonkey_NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BoltMonkey__NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated
969e4c9b41e733a367f5ea18ed50a6171b5e2357
21.345511
llama3.1
2
8.03
true
false
false
false
0.774319
0.459023
45.902317
0.518544
30.793785
0.093656
9.365559
0.274329
3.243848
0.40826
9.532552
0.363115
29.235003
true
false
2024-10-01
2024-10-01
1
BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated (Merge)
BoltMonkey_SuperNeuralDreadDevil-8b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BoltMonkey/SuperNeuralDreadDevil-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BoltMonkey/SuperNeuralDreadDevil-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BoltMonkey__SuperNeuralDreadDevil-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BoltMonkey/SuperNeuralDreadDevil-8b
804d5864127e603abec179a159b43f446246fafc
21.847726
1
8.03
false
false
false
true
2.405331
0.485801
48.580101
0.515108
30.606714
0.090634
9.063444
0.285235
4.697987
0.415948
10.426823
0.349402
27.711288
false
false
2024-10-13
2024-10-13
1
BoltMonkey/SuperNeuralDreadDevil-8b (Merge)
BrainWave-ML_llama3.2-3B-maths-orpo_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BrainWave-ML/llama3.2-3B-maths-orpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BrainWave-ML/llama3.2-3B-maths-orpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BrainWave-ML__llama3.2-3B-maths-orpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BrainWave-ML/llama3.2-3B-maths-orpo
d149d83d8e8f3883421d800848fec85766181923
5.076083
apache-2.0
2
3
true
false
false
false
0.707219
0.204907
20.490742
0.291178
2.347041
0
0
0.259228
1.230425
0.357531
4.52474
0.116772
1.863549
false
false
2024-10-24
2024-10-24
2
meta-llama/Llama-3.2-3B-Instruct
BramVanroy_GEITje-7B-ultra_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/BramVanroy/GEITje-7B-ultra" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BramVanroy/GEITje-7B-ultra</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BramVanroy__GEITje-7B-ultra-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BramVanroy/GEITje-7B-ultra
d4552cdc6f015754646464d8411aa4f6bcdba8e8
10.909606
cc-by-nc-4.0
40
7.242
true
false
false
true
0.619523
0.372344
37.234427
0.377616
12.879913
0.009063
0.906344
0.262584
1.677852
0.328979
1.522396
0.20113
11.236702
false
false
2024-01-27
2024-10-28
3
mistralai/Mistral-7B-v0.1
BramVanroy_fietje-2_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/BramVanroy/fietje-2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BramVanroy/fietje-2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BramVanroy__fietje-2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BramVanroy/fietje-2
3abe75d01094b713368e3d911ffb78a2d66ead22
9.027007
mit
8
2.78
true
false
false
false
0.312539
0.209803
20.980332
0.403567
15.603676
0.009063
0.906344
0.254195
0.559284
0.369563
5.161979
0.198554
10.950428
false
false
2024-04-09
2024-10-28
1
microsoft/phi-2
BramVanroy_fietje-2-chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/BramVanroy/fietje-2-chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BramVanroy/fietje-2-chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BramVanroy__fietje-2-chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BramVanroy/fietje-2-chat
364e785d90438b787b94e33741a930c9932353c0
10.388869
mit
3
2.775
true
false
false
true
0.399033
0.291736
29.173593
0.414975
17.718966
0.005287
0.528701
0.239933
0
0.35276
3.195052
0.205452
11.716903
false
false
2024-04-29
2024-10-28
3
microsoft/phi-2
BramVanroy_fietje-2-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/BramVanroy/fietje-2-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BramVanroy/fietje-2-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BramVanroy__fietje-2-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BramVanroy/fietje-2-instruct
b7b44797cd52eda1182667217e8371dbdfee4976
10.196192
mit
3
2.775
true
false
false
true
0.324395
0.278996
27.89964
0.413607
17.57248
0.005287
0.528701
0.233221
0
0.336917
2.914583
0.210356
12.261746
false
false
2024-04-27
2024-10-28
2
microsoft/phi-2
CarrotAI_Llama-3.2-Rabbit-Ko-3B-Instruct_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/CarrotAI/Llama-3.2-Rabbit-Ko-3B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CarrotAI/Llama-3.2-Rabbit-Ko-3B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CarrotAI__Llama-3.2-Rabbit-Ko-3B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CarrotAI/Llama-3.2-Rabbit-Ko-3B-Instruct
5be46c768d800447b82de41fdc9df2f8c43ba3c0
20.500891
llama3.2
8
3.213
true
false
false
true
0.567954
0.719882
71.988213
0.442672
21.49731
0.024924
2.492447
0.270973
2.796421
0.364917
3.98125
0.282247
20.249704
false
false
2024-09-30
2024-12-20
1
CarrotAI/Llama-3.2-Rabbit-Ko-3B-Instruct (Merge)
CarrotAI_Llama-3.2-Rabbit-Ko-3B-Instruct-2412_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/CarrotAI/Llama-3.2-Rabbit-Ko-3B-Instruct-2412" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CarrotAI/Llama-3.2-Rabbit-Ko-3B-Instruct-2412</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CarrotAI__Llama-3.2-Rabbit-Ko-3B-Instruct-2412-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CarrotAI/Llama-3.2-Rabbit-Ko-3B-Instruct-2412
ac6f1c0b756412163e17cb05d9e2f7ced274dc12
20.238815
llama3.2
1
3.213
true
false
false
false
0.643589
0.478182
47.818233
0.435772
20.17568
0.172205
17.220544
0.292785
5.704698
0.387208
6.801042
0.313414
23.712692
false
false
2024-12-03
2024-12-19
1
CarrotAI/Llama-3.2-Rabbit-Ko-3B-Instruct-2412 (Merge)
Casual-Autopsy_L3-Umbral-Mind-RP-v2.0-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Casual-Autopsy/L3-Umbral-Mind-RP-v2.0-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Casual-Autopsy/L3-Umbral-Mind-RP-v2.0-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Casual-Autopsy__L3-Umbral-Mind-RP-v2.0-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Casual-Autopsy/L3-Umbral-Mind-RP-v2.0-8B
b46c066ea8387264858dc3461f382e7b42fd9c48
25.911927
llama3
14
8.03
true
false
false
true
0.988385
0.712263
71.226346
0.526241
32.486278
0.110272
11.02719
0.286913
4.9217
0.368667
5.55
0.37234
30.260047
true
false
2024-06-26
2024-07-02
1
Casual-Autopsy/L3-Umbral-Mind-RP-v2.0-8B (Merge)
CausalLM_14B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/CausalLM/14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CausalLM/14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CausalLM__14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CausalLM/14B
cc054cf5953252d0709cb3267d1a85246e489e95
16.643939
wtfpl
299
14
true
false
false
false
0.996414
0.278821
27.882131
0.470046
24.780943
0.04003
4.003021
0.302852
7.04698
0.415479
11.468229
0.322141
24.682329
false
true
2023-10-22
2024-06-12
0
CausalLM/14B
CausalLM_34b-beta_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/CausalLM/34b-beta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CausalLM/34b-beta</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CausalLM__34b-beta-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CausalLM/34b-beta
0429951eb30ccdfff3515e711aaa7649a8a7364c
23.285245
gpl-3.0
62
34.389
true
false
false
false
2.926596
0.304325
30.432475
0.5591
36.677226
0.047583
4.758308
0.346477
12.863535
0.374865
6.92474
0.532497
48.055186
false
true
2024-02-06
2024-06-26
0
CausalLM/34b-beta
CausalLM_preview-1-hf_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GlmForCausalLM
<a target="_blank" href="https://huggingface.co/CausalLM/preview-1-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CausalLM/preview-1-hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CausalLM__preview-1-hf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CausalLM/preview-1-hf
08e1e1ab428a591e74d849ff30bd8766474205bf
16.480167
0
9.543
true
false
false
true
1.278749
0.555893
55.589281
0.361457
10.100941
0.016616
1.661631
0.261745
1.565996
0.342188
1.106771
0.359707
28.856383
false
true
2025-01-26
0
Removed
Changgil_K2S3-14b-v0.2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Changgil/K2S3-14b-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Changgil/K2S3-14b-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Changgil__K2S3-14b-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Changgil/K2S3-14b-v0.2
b4f0e1eed2640df2b75847ff37e6ebb1be217b6c
15.187668
cc-by-nc-4.0
0
14.352
true
false
false
false
1.62463
0.324284
32.428401
0.461331
24.283947
0.052115
5.21148
0.28104
4.138702
0.39226
6.799219
0.264378
18.264258
false
false
2024-06-17
2024-06-27
0
Changgil/K2S3-14b-v0.2
Changgil_K2S3-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Changgil/K2S3-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Changgil/K2S3-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Changgil__K2S3-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Changgil/K2S3-v0.1
d544e389f091983bb4f11314edb526d81753c919
14.839284
cc-by-nc-4.0
0
14.352
true
false
false
false
1.249882
0.327656
32.765617
0.465549
24.559558
0.046073
4.607251
0.264262
1.901566
0.401406
7.842448
0.256233
17.359264
false
false
2024-04-29
2024-06-27
0
Changgil/K2S3-v0.1
ClaudioItaly_Albacus_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/ClaudioItaly/Albacus" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ClaudioItaly/Albacus</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ClaudioItaly__Albacus-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ClaudioItaly/Albacus
a53faf62d0f99b67478ed9d262872c821a3ba83c
20.492986
mit
1
8.987
true
false
false
false
0.753939
0.466742
46.674158
0.511304
31.638865
0.070242
7.024169
0.271812
2.908277
0.413531
10.658073
0.316489
24.054374
true
false
2024-09-08
2024-09-08
1
ClaudioItaly/Albacus (Merge)
ClaudioItaly_Book-Gut12B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/ClaudioItaly/Book-Gut12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ClaudioItaly/Book-Gut12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ClaudioItaly__Book-Gut12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ClaudioItaly/Book-Gut12B
ae54351faca8170c93bf1de3a51bf16650f5bcf5
23.343746
mit
1
12.248
true
false
false
false
1.452248
0.399847
39.984685
0.541737
34.632193
0.098943
9.89426
0.307047
7.606264
0.463542
18.276042
0.367021
29.669031
true
false
2024-09-12
2024-09-17
1
ClaudioItaly/Book-Gut12B (Merge)
ClaudioItaly_Evolutionstory-7B-v2.2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/ClaudioItaly/Evolutionstory-7B-v2.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ClaudioItaly/Evolutionstory-7B-v2.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ClaudioItaly__Evolutionstory-7B-v2.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ClaudioItaly/Evolutionstory-7B-v2.2
9f838721d24a5195bed59a5ed8d9af536f7f2459
20.798247
mit
1
7.242
true
false
false
false
0.560232
0.481379
48.137941
0.510804
31.623865
0.070242
7.024169
0.275168
3.355705
0.413531
10.658073
0.315908
23.989731
true
false
2024-08-30
2024-09-01
1
ClaudioItaly/Evolutionstory-7B-v2.2 (Merge)
ClaudioItaly_intelligence-cod-rag-7b-v3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/ClaudioItaly/intelligence-cod-rag-7b-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ClaudioItaly/intelligence-cod-rag-7b-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ClaudioItaly__intelligence-cod-rag-7b-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ClaudioItaly/intelligence-cod-rag-7b-v3
2b21473c8a086f8d0c54b82c3454bf5499cdde3a
27.129011
mit
0
7.616
true
false
false
true
0.660472
0.689782
68.9782
0.536634
34.776159
0.098187
9.818731
0.272651
3.020134
0.415271
10.675521
0.419548
35.505319
true
false
2024-11-29
2024-12-02
1
ClaudioItaly/intelligence-cod-rag-7b-v3 (Merge)
CohereForAI_aya-23-35B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
<a target="_blank" href="https://huggingface.co/CohereForAI/aya-23-35B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CohereForAI/aya-23-35B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__aya-23-35B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CohereForAI/aya-23-35B
31d6fd858f20539a55401c7ad913086f54d9ca2c
24.67988
cc-by-nc-4.0
268
34.981
true
false
false
true
16.985317
0.646193
64.619321
0.539955
34.85836
0.030211
3.021148
0.294463
5.928412
0.43099
13.473698
0.335605
26.178339
false
true
2024-05-19
2024-06-12
0
CohereForAI/aya-23-35B
CohereForAI_aya-23-8B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
<a target="_blank" href="https://huggingface.co/CohereForAI/aya-23-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CohereForAI/aya-23-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__aya-23-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CohereForAI/aya-23-8B
ec151d218a24031eb039d92fb83d10445427efc9
15.998395
cc-by-nc-4.0
400
8.028
true
false
false
true
1.195172
0.469889
46.988878
0.429616
20.203761
0.015861
1.586103
0.284396
4.58613
0.394063
8.424479
0.227809
14.20102
false
true
2024-05-19
2024-06-12
0
CohereForAI/aya-23-8B
CohereForAI_aya-expanse-32b_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
<a target="_blank" href="https://huggingface.co/CohereForAI/aya-expanse-32b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CohereForAI/aya-expanse-32b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__aya-expanse-32b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CohereForAI/aya-expanse-32b
08b69cfa4240e2009c80ad304f000b491d1b8c38
29.391219
cc-by-nc-4.0
203
32.296
true
false
false
true
5.517735
0.730174
73.017372
0.564867
38.709611
0.133686
13.36858
0.325503
10.067114
0.387271
6.408854
0.412982
34.775783
false
true
2024-10-23
2024-10-24
0
CohereForAI/aya-expanse-32b
CohereForAI_aya-expanse-8b_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
<a target="_blank" href="https://huggingface.co/CohereForAI/aya-expanse-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CohereForAI/aya-expanse-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__aya-expanse-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CohereForAI/aya-expanse-8b
b9848575c8731981dfcf2e1f3bfbcb917a2e585d
22.142223
cc-by-nc-4.0
327
8.028
true
false
false
true
1.169689
0.635852
63.585176
0.49772
28.523483
0.070242
7.024169
0.302852
7.04698
0.372885
4.410677
0.300366
22.262855
false
true
2024-10-23
2024-10-24
0
CohereForAI/aya-expanse-8b
CohereForAI_c4ai-command-r-plus_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
<a target="_blank" href="https://huggingface.co/CohereForAI/c4ai-command-r-plus" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CohereForAI/c4ai-command-r-plus</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__c4ai-command-r-plus-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CohereForAI/c4ai-command-r-plus
fa1bd7fb1572ceb861bbbbecfa8af83b29fa8cca
30.961247
cc-by-nc-4.0
1704
103.811
true
false
false
true
28.631532
0.766419
76.641866
0.581542
39.919954
0.081571
8.1571
0.305369
7.38255
0.480719
20.423177
0.399186
33.242834
false
true
2024-04-03
2024-06-13
0
CohereForAI/c4ai-command-r-plus
CohereForAI_c4ai-command-r-plus-08-2024_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
<a target="_blank" href="https://huggingface.co/CohereForAI/c4ai-command-r-plus-08-2024" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CohereForAI/c4ai-command-r-plus-08-2024</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__c4ai-command-r-plus-08-2024-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CohereForAI/c4ai-command-r-plus-08-2024
2d8cf3ab0af78b9e43546486b096f86adf3ba4d0
33.584534
cc-by-nc-4.0
223
103.811
true
false
false
true
22.318877
0.753954
75.395395
0.5996
42.836865
0.120091
12.009063
0.350671
13.422819
0.482948
19.835156
0.442071
38.007905
false
true
2024-08-21
2024-09-19
0
CohereForAI/c4ai-command-r-plus-08-2024
CohereForAI_c4ai-command-r-v01_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
<a target="_blank" href="https://huggingface.co/CohereForAI/c4ai-command-r-v01" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CohereForAI/c4ai-command-r-v01</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__c4ai-command-r-v01-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CohereForAI/c4ai-command-r-v01
16881ccde1c68bbc7041280e6a66637bc46bfe88
25.349978
cc-by-nc-4.0
1073
34.981
true
false
false
true
13.395437
0.674819
67.481948
0.540642
34.556659
0
0
0.307047
7.606264
0.451698
16.128906
0.336935
26.326093
false
true
2024-03-11
2024-06-13
0
CohereForAI/c4ai-command-r-v01
CohereForAI_c4ai-command-r7b-12-2024_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Cohere2ForCausalLM
<a target="_blank" href="https://huggingface.co/CohereForAI/c4ai-command-r7b-12-2024" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CohereForAI/c4ai-command-r7b-12-2024</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__c4ai-command-r7b-12-2024-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CohereForAI/c4ai-command-r7b-12-2024
a9650f3bda8b0e00825ee36592e086b4ee621102
31.07624
cc-by-nc-4.0
351
8.028
true
false
false
true
2.454807
0.771315
77.131456
0.550264
36.024564
0.266616
26.661631
0.308725
7.829978
0.41251
10.230469
0.357214
28.579344
false
true
2024-12-11
2024-12-20
0
CohereForAI/c4ai-command-r7b-12-2024
Columbia-NLP_LION-Gemma-2b-dpo-v1.0_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/Columbia-NLP/LION-Gemma-2b-dpo-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Columbia-NLP/LION-Gemma-2b-dpo-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-Gemma-2b-dpo-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Columbia-NLP/LION-Gemma-2b-dpo-v1.0
a5f780075831374f8850324448acf94976dea504
11.483995
0
2.506
false
false
false
true
0.979648
0.327831
32.783127
0.391996
14.585976
0.043051
4.305136
0.249161
0
0.41201
9.834635
0.166556
7.395095
false
false
2024-06-28
2024-07-04
0
Columbia-NLP/LION-Gemma-2b-dpo-v1.0
Columbia-NLP_LION-Gemma-2b-dpo-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/Columbia-NLP/LION-Gemma-2b-dpo-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Columbia-NLP/LION-Gemma-2b-dpo-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-Gemma-2b-dpo-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Columbia-NLP/LION-Gemma-2b-dpo-v1.0
a5f780075831374f8850324448acf94976dea504
11.1488
0
2.506
false
false
false
true
0.994569
0.310246
31.02457
0.388103
14.243046
0.046828
4.682779
0.253356
0.447427
0.408073
9.109115
0.166473
7.38586
false
false
2024-06-28
2024-07-04
0
Columbia-NLP/LION-Gemma-2b-dpo-v1.0
Columbia-NLP_LION-Gemma-2b-odpo-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/Columbia-NLP/LION-Gemma-2b-odpo-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Columbia-NLP/LION-Gemma-2b-odpo-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-Gemma-2b-odpo-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Columbia-NLP/LION-Gemma-2b-odpo-v1.0
090d9f59c3b47ab8dd099ddd278c058aa6d2d529
11.456795
4
2.506
false
false
false
true
0.962068
0.306649
30.664858
0.389584
14.023922
0.043051
4.305136
0.24245
0
0.427917
12.05625
0.169215
7.690603
false
false
2024-06-28
2024-07-13
0
Columbia-NLP/LION-Gemma-2b-odpo-v1.0
Columbia-NLP_LION-Gemma-2b-sft-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/Columbia-NLP/LION-Gemma-2b-sft-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Columbia-NLP/LION-Gemma-2b-sft-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-Gemma-2b-sft-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Columbia-NLP/LION-Gemma-2b-sft-v1.0
44d6f26fa7e3b0d238064d844569bf8a07b7515e
12.489957
0
2.506
false
false
false
true
0.960809
0.369247
36.924693
0.387878
14.117171
0.061178
6.117825
0.255872
0.782998
0.40274
8.309115
0.178191
8.687943
false
false
2024-07-02
2024-07-04
0
Columbia-NLP/LION-Gemma-2b-sft-v1.0
Columbia-NLP_LION-LLaMA-3-8b-dpo-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Columbia-NLP/LION-LLaMA-3-8b-dpo-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Columbia-NLP/LION-LLaMA-3-8b-dpo-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-LLaMA-3-8b-dpo-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Columbia-NLP/LION-LLaMA-3-8b-dpo-v1.0
3cddd4a6f5939a0a4db1092a0275342b7b9912f3
21.470701
2
8.03
false
false
false
true
0.696849
0.495742
49.574241
0.502848
30.356399
0.098187
9.818731
0.28104
4.138702
0.409719
10.28151
0.321892
24.654625
false
false
2024-06-28
2024-07-04
0
Columbia-NLP/LION-LLaMA-3-8b-dpo-v1.0
Columbia-NLP_LION-LLaMA-3-8b-odpo-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Columbia-NLP/LION-LLaMA-3-8b-odpo-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Columbia-NLP/LION-LLaMA-3-8b-odpo-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-LLaMA-3-8b-odpo-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Columbia-NLP/LION-LLaMA-3-8b-odpo-v1.0
e2cec0d68a67092951e9205dfe634a59f2f4a2dd
19.462976
2
8.03
false
false
false
true
0.718697
0.396799
39.679938
0.502393
30.457173
0.083082
8.308157
0.285235
4.697987
0.40575
9.71875
0.315243
23.915854
false
false
2024-06-28
2024-07-04
0
Columbia-NLP/LION-LLaMA-3-8b-odpo-v1.0
Columbia-NLP_LION-LLaMA-3-8b-sft-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Columbia-NLP/LION-LLaMA-3-8b-sft-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Columbia-NLP/LION-LLaMA-3-8b-sft-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-LLaMA-3-8b-sft-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Columbia-NLP/LION-LLaMA-3-8b-sft-v1.0
822eddb2fd127178d9fb7bb9f4fca0e93ada2836
20.459336
0
8.03
false
false
false
true
0.753613
0.381712
38.171164
0.508777
30.88426
0.096677
9.667674
0.277685
3.691275
0.450271
15.483854
0.32372
24.857787
false
false
2024-07-02
2024-07-04
0
Columbia-NLP/LION-LLaMA-3-8b-sft-v1.0
CombinHorizon_Josiefied-abliteratedV4-Qwen2.5-14B-Inst-BaseMerge-TIES_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CombinHorizon/Josiefied-abliteratedV4-Qwen2.5-14B-Inst-BaseMerge-TIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CombinHorizon/Josiefied-abliteratedV4-Qwen2.5-14B-Inst-BaseMerge-TIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CombinHorizon__Josiefied-abliteratedV4-Qwen2.5-14B-Inst-BaseMerge-TIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CombinHorizon/Josiefied-abliteratedV4-Qwen2.5-14B-Inst-BaseMerge-TIES
881729709fbf263b75e0f7341b66b5a880b82d11
32.903047
apache-2.0
2
14.77
true
false
false
true
1.665352
0.823996
82.399589
0.637009
48.19595
0
0
0.324664
9.955257
0.426031
12.653906
0.497922
44.213579
true
false
2024-12-07
2024-12-07
1
CombinHorizon/Josiefied-abliteratedV4-Qwen2.5-14B-Inst-BaseMerge-TIES (Merge)
CombinHorizon_Rombos-Qwen2.5-7B-Inst-BaseMerge-TIES_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CombinHorizon/Rombos-Qwen2.5-7B-Inst-BaseMerge-TIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CombinHorizon/Rombos-Qwen2.5-7B-Inst-BaseMerge-TIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CombinHorizon__Rombos-Qwen2.5-7B-Inst-BaseMerge-TIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CombinHorizon/Rombos-Qwen2.5-7B-Inst-BaseMerge-TIES
52d6f6308eba9c3a0b9116706fbb1ddc448e6101
27.14669
apache-2.0
1
7.616
true
false
false
true
1.045561
0.756402
75.64019
0.540209
34.95407
0
0
0.297819
6.375839
0.403302
8.779427
0.434176
37.130615
true
false
2024-10-29
2024-10-29
1
CombinHorizon/Rombos-Qwen2.5-7B-Inst-BaseMerge-TIES (Merge)
CombinHorizon_YiSM-blossom5.1-34B-SLERP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/CombinHorizon/YiSM-blossom5.1-34B-SLERP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CombinHorizon/YiSM-blossom5.1-34B-SLERP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CombinHorizon__YiSM-blossom5.1-34B-SLERP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CombinHorizon/YiSM-blossom5.1-34B-SLERP
ebd8d6507623008567a0548cd0ff9e28cbd6a656
31.392518
apache-2.0
0
34.389
true
false
false
true
3.070814
0.503311
50.331121
0.620755
46.397613
0.216012
21.601208
0.355705
14.09396
0.441344
14.367969
0.474069
41.563239
true
false
2024-08-27
2024-08-27
1
CombinHorizon/YiSM-blossom5.1-34B-SLERP (Merge)
CombinHorizon_huihui-ai-abliterated-Qwen2.5-32B-Inst-BaseMerge-TIES_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CombinHorizon/huihui-ai-abliterated-Qwen2.5-32B-Inst-BaseMerge-TIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CombinHorizon/huihui-ai-abliterated-Qwen2.5-32B-Inst-BaseMerge-TIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CombinHorizon__huihui-ai-abliterated-Qwen2.5-32B-Inst-BaseMerge-TIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CombinHorizon/huihui-ai-abliterated-Qwen2.5-32B-Inst-BaseMerge-TIES
3284c32f13733d1cd17c723ed754f2c01b65a15c
35.750999
apache-2.0
1
32.764
true
false
false
true
13.000422
0.820624
82.062372
0.692925
56.044782
0
0
0.338926
11.856823
0.420729
12.091146
0.572058
52.450872
true
false
2024-12-07
2024-12-07
1
CombinHorizon/huihui-ai-abliterated-Qwen2.5-32B-Inst-BaseMerge-TIES (Merge)
CombinHorizon_huihui-ai-abliteratedV2-Qwen2.5-14B-Inst-BaseMerge-TIES_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CombinHorizon/huihui-ai-abliteratedV2-Qwen2.5-14B-Inst-BaseMerge-TIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CombinHorizon/huihui-ai-abliteratedV2-Qwen2.5-14B-Inst-BaseMerge-TIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CombinHorizon__huihui-ai-abliteratedV2-Qwen2.5-14B-Inst-BaseMerge-TIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CombinHorizon/huihui-ai-abliteratedV2-Qwen2.5-14B-Inst-BaseMerge-TIES
d92237b4b4deccb92a72b5209c79978f09fe3f08
32.339826
apache-2.0
2
14.77
true
false
false
true
1.66713
0.817576
81.757625
0.633589
47.767346
0
0
0.314597
8.612975
0.426031
12.453906
0.491024
43.447104
true
false
2024-12-07
2024-12-07
1
CombinHorizon/huihui-ai-abliteratedV2-Qwen2.5-14B-Inst-BaseMerge-TIES (Merge)
CombinHorizon_zetasepic-abliteratedV2-Qwen2.5-32B-Inst-BaseMerge-TIES_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CombinHorizon/zetasepic-abliteratedV2-Qwen2.5-32B-Inst-BaseMerge-TIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CombinHorizon/zetasepic-abliteratedV2-Qwen2.5-32B-Inst-BaseMerge-TIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CombinHorizon__zetasepic-abliteratedV2-Qwen2.5-32B-Inst-BaseMerge-TIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CombinHorizon/zetasepic-abliteratedV2-Qwen2.5-32B-Inst-BaseMerge-TIES
d976a5d6768d54c5e59a88fe63238a055c30c06a
37.007831
apache-2.0
8
32.764
true
false
false
true
3.683318
0.832814
83.28136
0.695517
56.827407
0
0
0.36745
15.659955
0.431396
14.224479
0.568484
52.053783
true
false
2024-12-07
2024-12-20
1
CombinHorizon/zetasepic-abliteratedV2-Qwen2.5-32B-Inst-BaseMerge-TIES (Merge)
ContactDoctor_Bio-Medical-3B-CoT-012025_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/ContactDoctor/Bio-Medical-3B-CoT-012025" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ContactDoctor/Bio-Medical-3B-CoT-012025</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ContactDoctor__Bio-Medical-3B-CoT-012025-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ContactDoctor/Bio-Medical-3B-CoT-012025
37e0ac4b64a82964af3b33324629324cbcbf7cda
16.99355
other
8
3.085
true
false
false
false
0.799845
0.360379
36.037935
0.438315
22.263528
0.117069
11.706949
0.30453
7.270694
0.33676
3.195052
0.293384
21.487145
false
false
2025-01-06
2025-01-15
2
Qwen/Qwen2.5-3B
ContactDoctor_Bio-Medical-Llama-3-8B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ContactDoctor/Bio-Medical-Llama-3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ContactDoctor/Bio-Medical-Llama-3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ContactDoctor__Bio-Medical-Llama-3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ContactDoctor/Bio-Medical-Llama-3-8B
5436cda92c65b0ef520d278d864305c0f429824b
20.005569
other
46
4.015
true
false
false
false
0.617558
0.442237
44.22366
0.486312
26.195811
0.072508
7.250755
0.333893
11.185682
0.351396
1.757812
0.364777
29.419696
false
false
2024-08-09
2024-12-24
1
meta-llama/Meta-Llama-3-8B-Instruct
CoolSpring_Qwen2-0.5B-Abyme_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CoolSpring/Qwen2-0.5B-Abyme" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CoolSpring/Qwen2-0.5B-Abyme</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CoolSpring__Qwen2-0.5B-Abyme-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CoolSpring/Qwen2-0.5B-Abyme
a48b7c04b854e5c60fe3464f96904bfc53c8640c
4.798584
apache-2.0
0
0.494
true
false
false
true
1.177797
0.191519
19.15185
0.286183
2.276484
0.017372
1.73716
0.253356
0.447427
0.354219
1.477344
0.133311
3.701241
false
false
2024-07-18
2024-09-04
1
Qwen/Qwen2-0.5B
CoolSpring_Qwen2-0.5B-Abyme-merge2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CoolSpring/Qwen2-0.5B-Abyme-merge2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CoolSpring/Qwen2-0.5B-Abyme-merge2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CoolSpring__Qwen2-0.5B-Abyme-merge2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CoolSpring/Qwen2-0.5B-Abyme-merge2
02c4c601453f7ecbfab5c95bf5afa889350026ba
6.118848
apache-2.0
0
0.63
true
false
false
true
0.609695
0.202185
20.218465
0.299427
3.709041
0.021148
2.114804
0.260067
1.342282
0.368729
3.891146
0.148936
5.437352
true
false
2024-07-27
2024-07-27
1
CoolSpring/Qwen2-0.5B-Abyme-merge2 (Merge)
CoolSpring_Qwen2-0.5B-Abyme-merge3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CoolSpring/Qwen2-0.5B-Abyme-merge3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CoolSpring/Qwen2-0.5B-Abyme-merge3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CoolSpring__Qwen2-0.5B-Abyme-merge3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CoolSpring/Qwen2-0.5B-Abyme-merge3
86fed893893cc2a6240f0ea09ce2eeda1a5178cc
6.706903
apache-2.0
0
0.63
true
false
false
true
0.610171
0.238605
23.860468
0.300314
4.301149
0.024924
2.492447
0.264262
1.901566
0.350094
2.128385
0.150017
5.557402
true
false
2024-07-27
2024-07-27
1
CoolSpring/Qwen2-0.5B-Abyme-merge3 (Merge)
Corianas_Neural-Mistral-7B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Corianas/Neural-Mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Corianas/Neural-Mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Corianas__Neural-Mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Corianas/Neural-Mistral-7B
cde6f0126310f38b6781cc26cdb9a02416b896b9
18.200439
apache-2.0
0
7.242
true
false
false
true
0.461713
0.548924
54.892352
0.442802
22.431163
0.018882
1.888218
0.283557
4.474273
0.387271
6.208854
0.27377
19.307772
false
false
2024-03-05
2024-12-06
0
Corianas/Neural-Mistral-7B
Corianas_Quokka_2.7b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPT2LMHeadModel
<a target="_blank" href="https://huggingface.co/Corianas/Quokka_2.7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Corianas/Quokka_2.7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Corianas__Quokka_2.7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Corianas/Quokka_2.7b
d9b3274662c2ac6c6058daac90504b5a8ebcac3c
4.919721
apache-2.0
0
2.786
true
false
false
false
0.293691
0.174907
17.490702
0.305547
3.165268
0.003776
0.377644
0.255872
0.782998
0.390833
6.0875
0.114528
1.614214
false
false
2023-03-30
2024-12-05
0
Corianas/Quokka_2.7b
Corianas_llama-3-reactor_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Corianas/llama-3-reactor" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Corianas/llama-3-reactor</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Corianas__llama-3-reactor-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Corianas/llama-3-reactor
bef2eac42fd89baa0064badbc9c7958ad9ccbed3
14.020646
apache-2.0
0
-1
true
false
false
false
0.821165
0.230012
23.001192
0.445715
21.88856
0.048338
4.833837
0.297819
6.375839
0.397719
8.014844
0.280086
20.009604
false
false
2024-07-20
2024-07-23
0
Corianas/llama-3-reactor
CortexLM_btlm-7b-base-v0.2_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/CortexLM/btlm-7b-base-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CortexLM/btlm-7b-base-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CortexLM__btlm-7b-base-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CortexLM/btlm-7b-base-v0.2
eda8b4298365a26c8981316e09427c237b11217f
8.869902
mit
1
6.885
true
false
false
false
0.711358
0.148329
14.832866
0.400641
16.193277
0.012085
1.208459
0.253356
0.447427
0.384604
5.542188
0.234957
14.995198
false
false
2024-06-13
2024-06-26
0
CortexLM/btlm-7b-base-v0.2
Cran-May_T.E-8.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Cran-May/T.E-8.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Cran-May/T.E-8.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Cran-May__T.E-8.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Cran-May/T.E-8.1
5f84709710dcce7cc05fa12473e8bb207fe25849
29.405457
cc-by-nc-sa-4.0
3
7.616
true
false
false
true
1.090633
0.707692
70.769226
0.558175
37.024377
0.067976
6.797583
0.312919
8.389262
0.450521
15.315104
0.443235
38.13719
false
false
2024-09-27
2024-09-29
1
Cran-May/T.E-8.1 (Merge)
CultriX_Qwen2.5-14B-Broca_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Broca" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Broca</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Broca-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Broca
51204ee25a629abfd6d5e77a850b5e7a36c78462
37.723091
1
14.766
false
false
false
false
2.077001
0.560414
56.041415
0.652715
50.034412
0.345921
34.592145
0.386745
18.232662
0.476656
18.948698
0.536403
48.489214
false
false
2024-12-23
2024-12-23
1
CultriX/Qwen2.5-14B-Broca (Merge)
CultriX_Qwen2.5-14B-BrocaV9_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-BrocaV9" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-BrocaV9</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-BrocaV9-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-BrocaV9
883dafbff4edb8c83ef58a33413d4e09e922a53d
37.848878
2
14.766
false
false
false
false
1.774003
0.676293
67.629335
0.639138
48.053225
0.296828
29.682779
0.364094
15.212528
0.469031
18.395573
0.533078
48.119829
false
false
2025-01-02
2025-01-10
1
CultriX/Qwen2.5-14B-BrocaV9 (Merge)
CultriX_Qwen2.5-14B-Brocav3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Brocav3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Brocav3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Brocav3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Brocav3
6f3fe686a79dcbcd5835ca100e194c49f493167b
38.764254
2
14.766
false
false
false
false
1.816739
0.695178
69.517768
0.645235
49.049112
0.322508
32.250755
0.35906
14.541387
0.475635
19.254427
0.531749
47.972074
false
false
2024-12-23
2024-12-23
1
CultriX/Qwen2.5-14B-Brocav3 (Merge)
CultriX_Qwen2.5-14B-Brocav6_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Brocav6" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Brocav6</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Brocav6-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Brocav6
bd981505b6950df69216b260c3c0d86124fded7b
38.317568
2
14.766
false
false
false
false
1.791401
0.699524
69.952393
0.638884
47.819225
0.296073
29.607251
0.36745
15.659955
0.474208
18.876042
0.531915
47.990544
false
false
2024-12-23
2024-12-23
1
CultriX/Qwen2.5-14B-Brocav6 (Merge)
CultriX_Qwen2.5-14B-Brocav7_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Brocav7" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Brocav7</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Brocav7-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Brocav7
06acee7f6e9796081ced6201001784907c77f96f
38.522214
1
14.766
false
false
false
false
1.701349
0.672372
67.237153
0.644403
48.905361
0.318731
31.873112
0.36745
15.659955
0.479604
20.150521
0.525765
47.307181
false
false
2024-12-23
2024-12-23
1
CultriX/Qwen2.5-14B-Brocav7 (Merge)
CultriX_Qwen2.5-14B-Emerged_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Emerged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Emerged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Emerged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Emerged
8bf0e31b23ee22858bbde2cee44dde88963f5084
37.662617
1
14.766
false
false
false
false
1.80736
0.700024
70.002371
0.626003
45.932419
0.307402
30.740181
0.357383
14.317673
0.469094
18.470052
0.518617
46.513002
false
false
2024-12-19
2024-12-19
1
CultriX/Qwen2.5-14B-Emerged (Merge)
CultriX_Qwen2.5-14B-Emergedv3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Emergedv3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Emergedv3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Emergedv3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Emergedv3
f4df1b9c2bf37bbfd6b2e8f2ff244c6029a5d546
34.842092
1
14.766
false
false
false
false
1.918928
0.638849
63.884936
0.619073
44.731608
0.206949
20.694864
0.360738
14.765101
0.472813
18.601563
0.51737
46.374483
false
false
2024-12-21
2024-12-21
1
CultriX/Qwen2.5-14B-Emergedv3 (Merge)
CultriX_Qwen2.5-14B-FinalMerge_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-FinalMerge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-FinalMerge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-FinalMerge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-FinalMerge
8fd624d0d8989a312d344772814da3575423897a
28.057015
1
14.766
false
false
false
false
1.943941
0.489098
48.909782
0.571495
38.162479
0.130665
13.066465
0.354866
13.982103
0.437906
14.504948
0.457447
39.716312
false
false
2024-12-22
2024-12-23
1
CultriX/Qwen2.5-14B-FinalMerge (Merge)
CultriX_Qwen2.5-14B-Hyper_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Hyper" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Hyper</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Hyper-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Hyper
a6399c43f84736ed1b11d8cc7a25edf634781207
37.535349
1
14.766
false
false
false
false
5.778149
0.539132
53.913173
0.650745
49.759879
0.33006
33.006042
0.391779
18.903803
0.489833
21.029167
0.5374
48.60003
false
false
2025-01-19
2025-01-19
1
CultriX/Qwen2.5-14B-Hyper (Merge)
CultriX_Qwen2.5-14B-Hyperionv3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Hyperionv3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Hyperionv3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Hyperionv3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Hyperionv3
bc36be5b5ca3053ae96d85e962249efd0b283c82
39.283772
3
14.766
false
false
false
false
1.982855
0.683637
68.363719
0.652217
49.950055
0.34139
34.138973
0.370805
16.107383
0.472969
18.921094
0.533993
48.22141
false
false
2025-01-10
2025-01-19
1
CultriX/Qwen2.5-14B-Hyperionv3 (Merge)
CultriX_Qwen2.5-14B-Hyperionv4_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Hyperionv4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Hyperionv4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Hyperionv4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Hyperionv4
60cc366b0648bcb40ed22ebc53d64cc5aca25550
37.380492
2
14.766
false
false
false
false
2.036807
0.54158
54.157968
0.647179
49.07652
0.33006
33.006042
0.397651
19.686801
0.483198
19.866406
0.536403
48.489214
false
false
2025-01-19
2025-01-19
1
CultriX/Qwen2.5-14B-Hyperionv4 (Merge)
CultriX_Qwen2.5-14B-Hyperionv5_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Hyperionv5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Hyperionv5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Hyperionv5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Hyperionv5
e0f4941349664a75ddd03e4d2c190284c951e54b
38.277336
2
14.766
false
false
false
false
1.986734
0.672921
67.292118
0.644266
48.94828
0.295317
29.531722
0.371644
16.219239
0.479542
19.876042
0.53017
47.796616
false
false
2025-01-19
2025-01-19
1
CultriX/Qwen2.5-14B-Hyperionv5 (Merge)
CultriX_Qwen2.5-14B-MegaMerge-pt2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-MegaMerge-pt2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-MegaMerge-pt2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-MegaMerge-pt2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-MegaMerge-pt2
20397f6cafc09c2cb74f105867cd99b3c68c71dc
36.694314
apache-2.0
2
14.766
true
false
false
false
2.250434
0.568308
56.830765
0.65777
50.907903
0.273414
27.34139
0.379195
17.225951
0.472875
18.742708
0.542055
49.117169
true
false
2024-10-24
2024-10-25
1
CultriX/Qwen2.5-14B-MegaMerge-pt2 (Merge)
CultriX_Qwen2.5-14B-MergeStock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-MergeStock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-MergeStock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-MergeStock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-MergeStock
fa00543296f2731793dfb0aac667571ccf1abb5b
36.390259
apache-2.0
2
14.766
true
false
false
false
4.430606
0.568533
56.85326
0.657934
51.009391
0.273414
27.34139
0.373322
16.442953
0.467635
17.854427
0.539561
48.84013
true
false
2024-10-23
2024-10-24
1
CultriX/Qwen2.5-14B-MergeStock (Merge)
CultriX_Qwen2.5-14B-Unity_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Unity" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Unity</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Unity-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Unity
1d15e7941e6ceff5d6e4f293378947bee721a24d
33.666802
3
14.766
false
false
false
false
1.913689
0.673895
67.389526
0.601996
42.258617
0.153323
15.332326
0.347315
12.975391
0.467948
18.760156
0.507563
45.284796
false
false
2024-12-21
2024-12-21
1
CultriX/Qwen2.5-14B-Unity (Merge)
CultriX_Qwen2.5-14B-Wernicke_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Wernicke" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Wernicke</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Wernicke-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Wernicke
622c0a58ecb0c0c679d7381a823d2ae5ac2b8ce1
36.999242
apache-2.0
5
14.77
true
false
false
false
2.222234
0.52347
52.346995
0.656836
50.642876
0.324773
32.477341
0.393456
19.127517
0.468906
18.246615
0.542387
49.154108
true
false
2024-10-21
2024-10-22
1
CultriX/Qwen2.5-14B-Wernicke (Merge)
CultriX_Qwen2.5-14B-Wernicke-SFT_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Wernicke-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Wernicke-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Wernicke-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Wernicke-SFT
3b68dfba2cf79e4a15e8f4271f7d4b62d2ab9f26
33.524336
apache-2.0
2
14.77
true
false
false
true
1.393013
0.493744
49.374438
0.646059
49.330572
0.358006
35.800604
0.354027
13.870246
0.39
7.55
0.506981
45.220154
true
false
2024-11-16
2024-11-17
1
CultriX/Qwen2.5-14B-Wernicke-SFT (Merge)
CultriX_Qwen2.5-14B-Wernicke-SLERP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Wernicke-SLERP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Wernicke-SLERP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Wernicke-SLERP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Wernicke-SLERP
180175561e8061be067fc349ad4491270f19976f
30.639825
0
14.491
false
false
false
true
2.155988
0.55889
55.889041
0.644093
49.372327
0.094411
9.441088
0.34396
12.527964
0.414031
11.120573
0.509392
45.487958
false
false
2024-10-25
0
Removed
CultriX_Qwen2.5-14B-Wernickev3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Wernickev3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Wernickev3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Wernickev3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Wernickev3
bd141b0df78ad1f6e2938edf167c2305b395a2b2
37.940558
1
14.766
false
false
false
false
1.915634
0.70482
70.481988
0.618415
44.576275
0.327795
32.779456
0.362416
14.988814
0.471667
18.691667
0.515126
46.125148
false
false
2024-12-19
2024-12-19
1
CultriX/Qwen2.5-14B-Wernickev3 (Merge)
CultriX_Qwen2.5-14B-partialmergept1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-partialmergept1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-partialmergept1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-partialmergept1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-partialmergept1
02c6491a2affea23c1e5d89d324a90d24a0e5381
34.853934
0
14.766
false
false
false
false
2.009336
0.633729
63.372851
0.615118
44.594404
0.19864
19.864048
0.361577
14.876957
0.475698
19.66224
0.520778
46.753103
false
false
2025-01-02
2025-01-19
1
CultriX/Qwen2.5-14B-partialmergept1 (Merge)
CultriX_Qwenfinity-2.5-14B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwenfinity-2.5-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwenfinity-2.5-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwenfinity-2.5-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwenfinity-2.5-14B
6acc1308274031b045f028b0a0290cdbe4243a04
27.96652
0
14.766
false
false
false
false
1.977067
0.481379
48.137941
0.565501
37.259942
0.148792
14.879154
0.348993
13.199105
0.450583
15.45625
0.449801
38.866726
false
false
2024-12-21
2024-12-23
1
CultriX/Qwenfinity-2.5-14B (Merge)
CultriX_Qwestion-14B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwestion-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwestion-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwestion-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwestion-14B
e286bfafbc28e36859202c9f06ed8287a4f1d8b6
37.630294
apache-2.0
1
14.766
true
false
false
false
1.853821
0.63178
63.178034
0.64501
48.757034
0.317221
31.722054
0.368289
15.771812
0.463604
17.217188
0.542221
49.135638
true
false
2024-11-21
2024-11-23
1
CultriX/Qwestion-14B (Merge)
CultriX_SeQwence-14B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/SeQwence-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/SeQwence-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__SeQwence-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/SeQwence-14B
f4a147b717ba0e9392f96e343250b00239196a22
36.659687
apache-2.0
3
14.766
true
false
false
false
1.796383
0.53516
53.516004
0.650567
50.163578
0.339879
33.987915
0.360738
14.765101
0.466615
18.426823
0.541888
49.0987
false
false
2024-11-20
2024-11-20
0
CultriX/SeQwence-14B
CultriX_SeQwence-14B-EvolMerge_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/SeQwence-14B-EvolMerge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/SeQwence-14B-EvolMerge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__SeQwence-14B-EvolMerge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/SeQwence-14B-EvolMerge
a98c932f0d71d76883fe9aa9d708af0506b01343
37.200413
apache-2.0
2
14.766
true
false
false
false
1.950826
0.538158
53.815764
0.657218
50.780351
0.317976
31.797583
0.380872
17.449664
0.482083
20.260417
0.541888
49.0987
true
false
2024-11-27
2024-11-27
1
CultriX/SeQwence-14B-EvolMerge (Merge)
CultriX_SeQwence-14B-EvolMergev1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/SeQwence-14B-EvolMergev1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/SeQwence-14B-EvolMergev1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__SeQwence-14B-EvolMergev1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/SeQwence-14B-EvolMergev1
6cc7116cdea757635dba52bb82a306654d118e77
36.852183
2
14.766
false
false
false
false
1.957896
0.555468
55.546838
0.654555
50.302259
0.324773
32.477341
0.376678
16.89038
0.462271
17.083854
0.539312
48.812426
false
false
2024-11-25
2024-11-27
1
CultriX/SeQwence-14B-EvolMergev1 (Merge)
CultriX_SeQwence-14B-v5_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/SeQwence-14B-v5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/SeQwence-14B-v5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__SeQwence-14B-v5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/SeQwence-14B-v5
9f43ad41542be56f6a18f31bfa60086318735ed5
37.268663
0
14.766
false
false
false
false
1.86516
0.591988
59.198815
0.651709
49.995731
0.310423
31.042296
0.369966
15.995526
0.471417
18.327083
0.541473
49.052527
false
false
2024-11-18
0
Removed
CultriX_SeQwence-14Bv1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/SeQwence-14Bv1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/SeQwence-14Bv1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__SeQwence-14Bv1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/SeQwence-14Bv1
542bfbd2e6fb25ecd11b84d956764eb23233a034
38.197632
apache-2.0
2
14.766
true
false
false
false
1.830191
0.6678
66.780033
0.634467
47.190898
0.335347
33.534743
0.361577
14.876957
0.470427
18.803385
0.531998
47.999778
true
false
2024-11-24
2024-11-27
1
CultriX/SeQwence-14Bv1 (Merge)
CultriX_SeQwence-14Bv2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/SeQwence-14Bv2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/SeQwence-14Bv2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__SeQwence-14Bv2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/SeQwence-14Bv2
674c6d49b604fdf26e327e1e86c4fde0724b98e8
34.409763
0
14.766
false
false
false
false
1.974894
0.578599
57.859923
0.630451
46.529224
0.216012
21.601208
0.360738
14.765101
0.460104
17.546354
0.533411
48.156767
false
false
2024-11-27
2024-12-08
1
CultriX/SeQwence-14Bv2 (Merge)
CultriX_SeQwence-14Bv3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/SeQwence-14Bv3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/SeQwence-14Bv3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__SeQwence-14Bv3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/SeQwence-14Bv3
b3f2b5273bbc996814a25aa9060fd6f4c0d93bca
34.411032
2
14.766
false
false
false
false
1.965075
0.571905
57.190477
0.630225
46.385368
0.221299
22.129909
0.364933
15.324385
0.462427
17.270052
0.533494
48.166002
false
false
2024-11-27
2024-11-27
1
CultriX/SeQwence-14Bv3 (Merge)
DRXD1000_Atlas-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DRXD1000/Atlas-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DRXD1000/Atlas-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DRXD1000__Atlas-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DRXD1000/Atlas-7B
967ee983e2a0b163c12da69f1f81aaf8ffb2a456
8.509639
apache-2.0
0
7.768
true
false
false
true
1.256758
0.370446
37.044597
0.330218
7.540208
0.002266
0.226586
0.25755
1.006711
0.33425
0.78125
0.140126
4.458481
false
false
2024-12-10
2024-12-10
0
DRXD1000/Atlas-7B
DRXD1000_Phoenix-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DRXD1000/Phoenix-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DRXD1000/Phoenix-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DRXD1000__Phoenix-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DRXD1000/Phoenix-7B
a5caa8036d8b7819eb723debe3f037471b5c4882
12.143216
apache-2.0
17
7.242
true
false
false
true
0.470872
0.320962
32.096171
0.393157
15.62018
0
0
0.278523
3.803132
0.384948
6.41849
0.234292
14.921321
false
false
2024-01-10
2024-12-11
0
DRXD1000/Phoenix-7B
DUAL-GPO_zephyr-7b-ipo-0k-15k-i1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
<a target="_blank" href="https://huggingface.co/DUAL-GPO/zephyr-7b-ipo-0k-15k-i1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DUAL-GPO/zephyr-7b-ipo-0k-15k-i1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DUAL-GPO__zephyr-7b-ipo-0k-15k-i1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DUAL-GPO/zephyr-7b-ipo-0k-15k-i1
564d269c67dfcc5c07a4fbc270a6a48da1929d30
15.492948
0
14.483
false
false
false
false
0.971423
0.275624
27.562423
0.447271
22.658643
0.030211
3.021148
0.291107
5.480984
0.417344
10.567969
0.312999
23.666519
false
false
2024-09-20
2024-09-22
1
DUAL-GPO/zephyr-7b-ipo-qlora-v0-merged
DZgas_GIGABATEMAN-7B_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DZgas/GIGABATEMAN-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DZgas/GIGABATEMAN-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DZgas__GIGABATEMAN-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DZgas/GIGABATEMAN-7B
edf2840350e7fd55895d9df560b489ac10ecb95e
20.446293
6
7.242
false
false
false
false
0.630337
0.460746
46.074638
0.503218
29.827517
0.053625
5.362538
0.28943
5.257271
0.432844
11.972135
0.317653
24.183658
false
false
2024-04-17
2024-09-15
1
DZgas/GIGABATEMAN-7B (Merge)
Daemontatox_AetherDrake-SFT_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Daemontatox/AetherDrake-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/AetherDrake-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__AetherDrake-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Daemontatox/AetherDrake-SFT
17a0f90f0c06f2adc885faccd0a6172a7b996126
22.827028
apache-2.0
2
8.03
true
false
false
false
1.449313
0.480355
48.035546
0.487201
27.139252
0.146526
14.652568
0.32047
9.395973
0.408844
9.972135
0.3499
27.766696
false
false
2024-12-24
2024-12-25
1
Daemontatox/AetherDrake-SFT (Merge)
Daemontatox_AetherSett_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Daemontatox/AetherSett" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/AetherSett</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__AetherSett-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Daemontatox/AetherSett
d8d86c6dc1b693192931b02e39290eca331ae84e
29.922137
apache-2.0
1
7.616
true
false
false
false
1.306856
0.536959
53.69586
0.545162
34.744146
0.307402
30.740181
0.307886
7.718121
0.460312
16.205729
0.427859
36.428783
false
false
2024-12-30
2024-12-30
3
Qwen/Qwen2.5-7B
Daemontatox_AetherTOT_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MllamaForConditionalGeneration
<a target="_blank" href="https://huggingface.co/Daemontatox/AetherTOT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/AetherTOT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__AetherTOT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Daemontatox/AetherTOT
71d99f8fb69276422daae61222e57087000c05b0
23.191413
apache-2.0
0
10.67
true
false
false
false
0.698923
0.439764
43.976427
0.506606
29.436391
0.149547
14.954683
0.323826
9.8434
0.407854
9.781771
0.380402
31.155807
false
false
2024-12-27
2024-12-28
2
meta-llama/Llama-3.2-11B-Vision-Instruct
Daemontatox_AetherTOT_bfloat16
bfloat16
🌸 multimodal
🌸
Original
MllamaForConditionalGeneration
<a target="_blank" href="https://huggingface.co/Daemontatox/AetherTOT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/AetherTOT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__AetherTOT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Daemontatox/AetherTOT
71d99f8fb69276422daae61222e57087000c05b0
22.874708
apache-2.0
0
10.67
true
false
false
false
0.708698
0.43829
43.82904
0.503431
29.031857
0.14426
14.425982
0.323826
9.8434
0.405188
9.248438
0.377826
30.869533
false
false
2024-12-27
2024-12-28
2
meta-llama/Llama-3.2-11B-Vision-Instruct
Daemontatox_AetherUncensored_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Daemontatox/AetherUncensored" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/AetherUncensored</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__AetherUncensored-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Daemontatox/AetherUncensored
e498d645faab591062c6919a98b35656e2d0c783
17.090876
apache-2.0
1
8.03
true
false
false
false
0.739253
0.404193
40.41931
0.446313
21.678618
0.067976
6.797583
0.288591
5.145414
0.374677
9.501302
0.271027
19.003029
false
false
2025-01-09
2025-01-09
2
cognitivecomputations/Dolphin3.0-Llama3.1-8B (Merge)
Daemontatox_CogitoDistil_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Daemontatox/CogitoDistil" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/CogitoDistil</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__CogitoDistil-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Daemontatox/CogitoDistil
f9a5302a0c4b464c44d79f745b8498ab51dd97de
12.384402
apache-2.0
0
7.616
true
false
false
true
0.81454
0.277648
27.764775
0.367677
11.948759
0.104985
10.498489
0.259228
1.230425
0.37549
4.802865
0.26255
18.061096
false
false
2025-01-22
2025-01-22
2
deepseek-ai/DeepSeek-R1-Distill-Qwen-7B