Dataset columns (dtype and value range or class count as reported by the dataset viewer):

| Column | Dtype | Range / classes |
|---|---|---|
| eval_name | stringlengths | 12 to 111 |
| Precision | stringclasses | 3 values |
| Type | stringclasses | 6 values |
| T | stringclasses | 6 values |
| Weight type | stringclasses | 2 values |
| Architecture | stringclasses | 52 values |
| Model | stringlengths | 355 to 689 |
| fullname | stringlengths | 4 to 102 |
| Model sha | stringlengths | 0 to 40 |
| Average ⬆️ | float64 | 1.03 to 52 |
| Hub License | stringclasses | 26 values |
| Hub ❤️ | int64 | 0 to 5.9k |
| #Params (B) | int64 | -1 to 140 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.03 to 107 |
| IFEval Raw | float64 | 0 to 0.9 |
| IFEval | float64 | 0 to 90 |
| BBH Raw | float64 | 0.27 to 0.75 |
| BBH | float64 | 0.81 to 63.5 |
| MATH Lvl 5 Raw | float64 | 0 to 0.51 |
| MATH Lvl 5 | float64 | 0 to 50.7 |
| GPQA Raw | float64 | 0.22 to 0.44 |
| GPQA | float64 | 0 to 24.9 |
| MUSR Raw | float64 | 0.29 to 0.6 |
| MUSR | float64 | 0 to 38.5 |
| MMLU-PRO Raw | float64 | 0.1 to 0.73 |
| MMLU-PRO | float64 | 0 to 70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | stringclasses | 424 values |
| Submission Date | stringclasses | 169 values |
| Generation | int64 | 0 to 10 |
| Base Model | stringlengths | 4 to 102 |
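The records below are easiest to work with programmatically. Here is a minimal sketch using the `datasets` library; the dataset id `open-llm-leaderboard/contents` is an assumption (the listing above does not name the repository), as is the column selection.

```python
# Minimal sketch (assumed dataset id): load the leaderboard rows and list the
# highest-scoring entries by the "Average ⬆️" column described above.
from datasets import load_dataset

ds = load_dataset("open-llm-leaderboard/contents", split="train")
print(ds.column_names)  # should match the column table above

df = ds.to_pandas()
top = df.sort_values("Average ⬆️", ascending=False).head(10)
print(top[["fullname", "Average ⬆️", "#Params (B)", "Hub License"]])
```

Boolean columns such as `Official Providers`, `Merged`, or `Chat Template` can be used the same way to filter the frame before sorting.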
T145_ZEUS-8B-V4_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/ZEUS-8B-V4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/ZEUS-8B-V4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__ZEUS-8B-V4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/ZEUS-8B-V4
ca89fdfe275397f430092a0f644dc02b22ba2a8b
29.629446
0
8
false
false
false
true
0.650576
0.780732
78.073179
0.524597
32.046144
0.191088
19.108761
0.307047
7.606264
0.402896
9.961979
0.378823
30.980349
false
false
2024-12-06
0
Removed
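One relationship worth noting in these records: the "Average ⬆️" value is the arithmetic mean of the six normalized benchmark scores (IFEval, BBH, MATH Lvl 5, GPQA, MUSR, MMLU-PRO), and each normalized score rescales the corresponding raw accuracy against a random-guess baseline per the leaderboard's scoring rules (stated here as an assumption, but it checks out numerically against the record above).

```python
# Check against the T145/ZEUS-8B-V4 record above: "Average ⬆️" is the mean of
# the six normalized benchmark scores.
scores = [78.073179, 32.046144, 19.108761, 7.606264, 9.961979, 30.980349]
print(round(sum(scores) / len(scores), 6))  # 29.629446 == the Average ⬆️ field

# Normalized scores rescale the raw accuracy against a random-guess baseline
# (assumed normalization); GPQA's four-choice questions use a 0.25 baseline.
gpqa_raw = 0.307047
gpqa_norm = (gpqa_raw - 0.25) / (1 - 0.25) * 100
print(round(gpqa_norm, 2))  # ~7.61, matching the GPQA field (7.606264)
```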
T145_qwen-2.5-3B-merge-test_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/T145/qwen-2.5-3B-merge-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/qwen-2.5-3B-merge-test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__qwen-2.5-3B-merge-test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/qwen-2.5-3B-merge-test
0d5f82d841f811fbf1ee07bfbf7c6eb1de812840
21.154151
0
3
false
false
false
true
0.783957
0.575102
57.510184
0.484249
27.889341
0.030967
3.096677
0.285235
4.697987
0.400729
8.291146
0.328956
25.439569
false
false
2024-11-16
0
Removed
THUDM_glm-4-9b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
ChatGLMModel
<a target="_blank" href="https://huggingface.co/THUDM/glm-4-9b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">THUDM/glm-4-9b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/THUDM__glm-4-9b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
THUDM/glm-4-9b
99a140996f9d4f197842fb6b1aab217a42e27ef3
18.006732
other
112
9
true
false
false
false
1.672447
0.142608
14.260828
0.552837
35.811284
0
0
0.316275
8.836689
0.438583
14.189583
0.414478
34.942007
false
false
2024-06-04
2024-07-04
0
THUDM/glm-4-9b
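The Model field of each record is a small HTML fragment carrying two links: the model repository and the per-model details dataset. When plain identifiers are needed, the hrefs can be pulled out directly; a short sketch using the glm-4-9b record above (the regex and variable names are illustrative, not part of the dataset):

```python
import re

# "Model" field copied from the THUDM/glm-4-9b record above.
model_html = (
    '<a target="_blank" href="https://huggingface.co/THUDM/glm-4-9b" '
    'style="color: var(--link-text-color); text-decoration: underline;'
    'text-decoration-style: dotted;">THUDM/glm-4-9b</a> '
    '<a target="_blank" href="https://huggingface.co/datasets/'
    'open-llm-leaderboard/THUDM__glm-4-9b-details" '
    'style="color: var(--link-text-color); text-decoration: underline;'
    'text-decoration-style: dotted;">📑</a>'
)

repo_url, details_url = re.findall(r'href="([^"]+)"', model_html)
repo_id = repo_url.removeprefix("https://huggingface.co/")  # "THUDM/glm-4-9b"
print(repo_id, details_url)
```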
THUDM_glm-4-9b-chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
ChatGLMModel
<a target="_blank" href="https://huggingface.co/THUDM/glm-4-9b-chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">THUDM/glm-4-9b-chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/THUDM__glm-4-9b-chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
THUDM/glm-4-9b-chat
04419001bc63e05e70991ade6da1f91c4aeec278
10.973477
other
637
9
true
false
false
true
0.247135
0
0
0.473639
25.205184
0
0
0.313758
8.501119
0.399427
8.061719
0.316656
24.072843
false
false
2024-06-04
2024-07-09
0
THUDM/glm-4-9b-chat
THUDM_glm-4-9b-chat-1m_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
ChatGLMModel
<a target="_blank" href="https://huggingface.co/THUDM/glm-4-9b-chat-1m" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">THUDM/glm-4-9b-chat-1m</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/THUDM__glm-4-9b-chat-1m-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
THUDM/glm-4-9b-chat-1m
0aa722c7e0745dd21453427dd44c257dd253304f
8.92251
other
181
9
true
false
false
true
0.20567
0
0
0.418006
17.108029
0
0
0.303691
7.158837
0.379458
5.232292
0.316323
24.035904
false
false
2024-06-04
2024-10-09
0
THUDM/glm-4-9b-chat-1m
TIGER-Lab_MAmmoTH2-7B-Plus_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/TIGER-Lab/MAmmoTH2-7B-Plus" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TIGER-Lab/MAmmoTH2-7B-Plus</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TIGER-Lab__MAmmoTH2-7B-Plus-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TIGER-Lab/MAmmoTH2-7B-Plus
3ed578d8dda09787137e363a0dc32e3a8ed908de
21.469862
mit
6
7
true
false
false
true
0.552663
0.557466
55.746641
0.423469
18.925953
0.175982
17.598187
0.280201
4.026846
0.412354
10.110938
0.301695
22.410609
false
false
2024-05-06
2024-06-27
0
TIGER-Lab/MAmmoTH2-7B-Plus
TTTXXX01_Mistral-7B-Base-SimPO2-5e-7_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/TTTXXX01/Mistral-7B-Base-SimPO2-5e-7" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TTTXXX01/Mistral-7B-Base-SimPO2-5e-7</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TTTXXX01__Mistral-7B-Base-SimPO2-5e-7-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TTTXXX01/Mistral-7B-Base-SimPO2-5e-7
7a271e3061165f4e1abfe26715c04e20c2ac935e
16.379688
apache-2.0
0
7
true
false
false
true
0.522996
0.439189
43.918913
0.431955
20.692627
0.024169
2.416918
0.297819
6.375839
0.360417
5.252083
0.276596
19.621749
false
false
2024-08-30
2024-09-01
2
mistralai/Mistral-7B-v0.1
TeeZee_DoubleBagel-57B-v1.0_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/TeeZee/DoubleBagel-57B-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TeeZee/DoubleBagel-57B-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TeeZee__DoubleBagel-57B-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TeeZee/DoubleBagel-57B-v1.0
6e10dc1fb5223d1b045dc2a19c9c267a574e520f
8.544103
apache-2.0
1
56
true
false
false
true
9.368647
0.233633
23.363343
0.325079
5.522782
0
0
0.276007
3.467562
0.43149
13.602865
0.147773
5.308067
true
false
2024-08-05
2024-08-10
1
TeeZee/DoubleBagel-57B-v1.0 (Merge)
TencentARC_LLaMA-Pro-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/TencentARC/LLaMA-Pro-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TencentARC/LLaMA-Pro-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TencentARC__LLaMA-Pro-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TencentARC/LLaMA-Pro-8B
7115e7179060e0623d1ee9ff4476faed7e478d8c
8.778934
llama2
171
8
true
false
false
false
47.807734
0.227714
22.771358
0.34842
9.29395
0.016616
1.661631
0.260067
1.342282
0.401812
8.593229
0.1811
9.011155
false
true
2024-01-05
2024-06-12
0
TencentARC/LLaMA-Pro-8B
TencentARC_LLaMA-Pro-8B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/TencentARC/LLaMA-Pro-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TencentARC/LLaMA-Pro-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TencentARC__LLaMA-Pro-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TencentARC/LLaMA-Pro-8B-Instruct
9850c8afce19a69d8fc4a1603a82441157514016
15.144991
llama2
63
8
true
false
false
true
3.105203
0.448606
44.860636
0.422421
19.485726
0.016616
1.661631
0.274329
3.243848
0.419021
11.110938
0.194564
10.507166
false
true
2024-01-06
2024-06-12
0
TencentARC/LLaMA-Pro-8B-Instruct
TencentARC_MetaMath-Mistral-Pro_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/TencentARC/MetaMath-Mistral-Pro" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TencentARC/MetaMath-Mistral-Pro</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TencentARC__MetaMath-Mistral-Pro-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TencentARC/MetaMath-Mistral-Pro
3835d38de15ed2a04c32aca879b782fc50e390bf
12.013002
apache-2.0
5
8
true
false
false
false
0.600752
0.211877
21.187671
0.441316
22.372279
0.046073
4.607251
0.269295
2.572707
0.352417
4.985417
0.247174
16.352689
false
true
2024-02-26
2024-06-12
0
TencentARC/MetaMath-Mistral-Pro
TencentARC_Mistral_Pro_8B_v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/TencentARC/Mistral_Pro_8B_v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TencentARC/Mistral_Pro_8B_v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TencentARC__Mistral_Pro_8B_v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TencentARC/Mistral_Pro_8B_v0.1
366f159fc5b314ba2a955209d2bca4600f84dac0
14.195346
apache-2.0
66
8
true
false
false
false
0.632482
0.211452
21.145228
0.452598
22.894189
0.056647
5.664653
0.280201
4.026846
0.424229
11.828646
0.276513
19.612515
false
true
2024-02-22
2024-06-12
0
TencentARC/Mistral_Pro_8B_v0.1
TheDrummer_Cydonia-22B-v1.2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/TheDrummer/Cydonia-22B-v1.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TheDrummer/Cydonia-22B-v1.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TheDrummer__Cydonia-22B-v1.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TheDrummer/Cydonia-22B-v1.2
acd8da5efadc7dc404bb4eeebef2b27b1554a2ca
28.399857
other
26
22
true
false
false
false
1.628704
0.563511
56.351148
0.580856
39.932604
0.179758
17.975831
0.330537
10.738255
0.402177
10.505469
0.414063
34.895833
false
false
2024-10-07
2024-10-26
0
TheDrummer/Cydonia-22B-v1.2
TheDrummer_Gemmasutra-9B-v1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/TheDrummer/Gemmasutra-9B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TheDrummer/Gemmasutra-9B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TheDrummer__Gemmasutra-9B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TheDrummer/Gemmasutra-9B-v1
21591f6a0140e095f1c6668ac7a267f214547609
22.736097
23
10
false
false
false
false
2.903819
0.241551
24.155131
0.588691
41.200396
0.082326
8.232628
0.310403
8.053691
0.484594
20.940885
0.404505
33.83385
false
false
2024-07-17
2024-09-19
1
TheDrummer/Gemmasutra-9B-v1 (Merge)
TheDrummer_Gemmasutra-Mini-2B-v1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/TheDrummer/Gemmasutra-Mini-2B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TheDrummer/Gemmasutra-Mini-2B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TheDrummer__Gemmasutra-Mini-2B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TheDrummer/Gemmasutra-Mini-2B-v1
c1db4c8f975d3848edbdaf851217039c8dfdaeb5
9.028588
other
47
2
true
false
false
true
1.397955
0.254866
25.486598
0.357502
9.810336
0.031722
3.172205
0.270973
2.796421
0.348979
1.189062
0.205452
11.716903
false
false
2024-08-03
2024-10-28
0
TheDrummer/Gemmasutra-Mini-2B-v1
TheDrummer_Ministrations-8B-v1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/TheDrummer/Ministrations-8B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TheDrummer/Ministrations-8B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TheDrummer__Ministrations-8B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TheDrummer/Ministrations-8B-v1
39b892de64401ec7990ebb816c4455ba4532bafb
21.151983
other
14
8
true
false
false
false
0.862556
0.282193
28.219347
0.487663
26.985637
0.175982
17.598187
0.324664
9.955257
0.444906
14.779948
0.364362
29.373522
false
false
2024-11-07
2024-11-14
0
TheDrummer/Ministrations-8B-v1
TheDrummer_Rocinante-12B-v1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/TheDrummer/Rocinante-12B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TheDrummer/Rocinante-12B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TheDrummer__Rocinante-12B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TheDrummer/Rocinante-12B-v1
74a4ae2584d45655298995198d5ab3e660364a1a
23.608456
other
26
12
true
false
false
true
1.864442
0.60765
60.764992
0.506545
30.025654
0.06571
6.570997
0.291107
5.480984
0.401719
11.28151
0.347739
27.526596
false
false
2024-08-14
2024-09-03
0
TheDrummer/Rocinante-12B-v1
TheHierophant_Underground-Cognitive-V0.3-test_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/TheHierophant/Underground-Cognitive-V0.3-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TheHierophant/Underground-Cognitive-V0.3-test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TheHierophant__Underground-Cognitive-V0.3-test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TheHierophant/Underground-Cognitive-V0.3-test
2753b6f9068ad14efe836cde3160747cd208bf9e
21.524923
0
10
false
false
false
false
0.586581
0.48083
48.082975
0.529013
33.665102
0.006042
0.60423
0.298658
6.487696
0.435115
14.55599
0.331782
25.753546
false
false
2024-11-22
2024-11-22
1
TheHierophant/Underground-Cognitive-V0.3-test (Merge)
TheTsar1209_nemo-carpmuscle-v0.1_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/TheTsar1209/nemo-carpmuscle-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TheTsar1209/nemo-carpmuscle-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TheTsar1209__nemo-carpmuscle-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TheTsar1209/nemo-carpmuscle-v0.1
84d20db8220014958ff157047b2216910637ae39
16.706372
apache-2.0
1
12
true
false
false
false
1.80844
0.227564
22.756397
0.508353
30.034996
0.042296
4.229607
0.29698
6.263982
0.4135
10.220833
0.340592
26.732417
false
false
2024-08-15
2024-10-10
1
unsloth/Mistral-Nemo-Base-2407-bnb-4bit
TheTsar1209_qwen-carpmuscle-r-v0.3_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/TheTsar1209/qwen-carpmuscle-r-v0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TheTsar1209/qwen-carpmuscle-r-v0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TheTsar1209__qwen-carpmuscle-r-v0.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TheTsar1209/qwen-carpmuscle-r-v0.3
30f8221d2f5f587343b1dbd65cf7d9bda4f5ef16
31.937556
1
14
false
false
false
true
2.256996
0.445509
44.550903
0.622712
46.375914
0.296828
29.682779
0.350671
13.422819
0.42776
12.003385
0.510306
45.589539
false
false
2024-10-23
2024-10-23
1
TheTsar1209/qwen-carpmuscle-r-v0.3 (Merge)
TheTsar1209_qwen-carpmuscle-v0.1_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/TheTsar1209/qwen-carpmuscle-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TheTsar1209/qwen-carpmuscle-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TheTsar1209__qwen-carpmuscle-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TheTsar1209/qwen-carpmuscle-v0.1
7c7b06a1788aef48054c3c6d6ad90c6dc5264a81
32.916329
apache-2.0
0
14
true
false
false
true
2.176218
0.562163
56.216284
0.64343
48.825595
0.231118
23.111782
0.34396
12.527964
0.416104
10.146354
0.52003
46.669991
false
false
2024-10-05
2024-10-10
3
Qwen/Qwen2.5-14B
TheTsar1209_qwen-carpmuscle-v0.2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/TheTsar1209/qwen-carpmuscle-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TheTsar1209/qwen-carpmuscle-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TheTsar1209__qwen-carpmuscle-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TheTsar1209/qwen-carpmuscle-v0.2
081f6b067ebca9bc384af283f1d267880534b8e3
33.477891
apache-2.0
0
14
true
false
false
true
2.248199
0.525693
52.569294
0.638692
48.182441
0.271903
27.190332
0.355705
14.09396
0.434552
12.752344
0.514711
46.078975
false
false
2024-10-16
2024-10-19
3
Qwen/Qwen2.5-14B
TheTsar1209_qwen-carpmuscle-v0.3_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/TheTsar1209/qwen-carpmuscle-v0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TheTsar1209/qwen-carpmuscle-v0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TheTsar1209__qwen-carpmuscle-v0.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TheTsar1209/qwen-carpmuscle-v0.3
ec92820e4ff36b6f21e1ef63546fe2ddcb34456a
31.227939
apache-2.0
0
14
true
false
false
true
4.230979
0.447632
44.763228
0.615153
45.543392
0.279456
27.945619
0.356544
14.205817
0.413188
9.781771
0.50615
45.127807
false
false
2024-10-28
2024-10-28
2
Qwen/Qwen2.5-14B
TheTsar1209_qwen-carpmuscle-v0.4_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/TheTsar1209/qwen-carpmuscle-v0.4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TheTsar1209/qwen-carpmuscle-v0.4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TheTsar1209__qwen-carpmuscle-v0.4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TheTsar1209/qwen-carpmuscle-v0.4
3e11d5aad0f19bd652b8605620d0cf6af7a0ea00
35.669388
apache-2.0
1
14
true
false
false
true
1.369624
0.720207
72.020683
0.645367
49.384956
0.173716
17.371601
0.352349
13.646532
0.451604
15.550521
0.514378
46.042036
false
false
2024-11-18
2024-11-18
3
Qwen/Qwen2.5-14B
Tijmen2_cosmosage-v3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Tijmen2/cosmosage-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Tijmen2/cosmosage-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Tijmen2__cosmosage-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Tijmen2/cosmosage-v3
e6d4b4e6868fcf113ab5261d71c7214a1f7fbb0c
16.813458
mit
1
8
true
false
false
true
0.830859
0.448232
44.82318
0.455064
22.687106
0.018127
1.812689
0.282718
4.362416
0.419885
10.685677
0.248587
16.509678
false
false
2024-06-20
2024-08-27
1
meta-llama/Meta-Llama-3-8B
TinyLlama_TinyLlama-1.1B-Chat-v0.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/TinyLlama/TinyLlama-1.1B-Chat-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TinyLlama/TinyLlama-1.1B-Chat-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TinyLlama__TinyLlama-1.1B-Chat-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TinyLlama/TinyLlama-1.1B-Chat-v0.1
7abc14e7779eabc3a028bc695342869d0410dea2
3.856872
apache-2.0
53
1
true
false
false
false
0.091095
0.147854
14.785436
0.308353
3.363011
0
0
0.229027
0
0.35924
3.904948
0.109791
1.08784
false
true
2023-09-16
2024-12-02
0
TinyLlama/TinyLlama-1.1B-Chat-v0.1
TinyLlama_TinyLlama-1.1B-Chat-v0.5_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/TinyLlama/TinyLlama-1.1B-Chat-v0.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TinyLlama/TinyLlama-1.1B-Chat-v0.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TinyLlama__TinyLlama-1.1B-Chat-v0.5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TinyLlama/TinyLlama-1.1B-Chat-v0.5
5c9e70dd07f5234bf6bf6a2425fffeecd5a6020b
4.075811
apache-2.0
8
1
true
false
false
false
0.094964
0.163367
16.336653
0.310505
3.407691
0.000755
0.075529
0.248322
0
0.366125
3.565625
0.109624
1.069371
false
true
2023-11-20
2024-10-23
0
TinyLlama/TinyLlama-1.1B-Chat-v0.5
TinyLlama_TinyLlama-1.1B-Chat-v0.6_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/TinyLlama/TinyLlama-1.1B-Chat-v0.6" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TinyLlama/TinyLlama-1.1B-Chat-v0.6</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TinyLlama__TinyLlama-1.1B-Chat-v0.6-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TinyLlama/TinyLlama-1.1B-Chat-v0.6
bf9ae1c8bf026667e6f810768de259bb4a7f4777
4.092866
apache-2.0
91
1
true
false
false
true
0.430347
0.157421
15.74212
0.306698
3.390371
0.003776
0.377644
0.258389
1.118568
0.342219
2.277344
0.11486
1.651152
false
true
2023-11-20
2024-10-23
0
TinyLlama/TinyLlama-1.1B-Chat-v0.6
TinyLlama_TinyLlama-1.1B-Chat-v1.0_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/TinyLlama/TinyLlama-1.1B-Chat-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TinyLlama/TinyLlama-1.1B-Chat-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TinyLlama__TinyLlama-1.1B-Chat-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TinyLlama/TinyLlama-1.1B-Chat-v1.0
fe8a4ea1ffedaf415f4da2f062534de366a451e6
2.718155
apache-2.0
1099
1
true
false
false
false
0.268441
0.059576
5.957637
0.310356
4.013397
0.009063
0.906344
0.25
0
0.351521
4.306771
0.110123
1.124778
false
true
2023-12-30
2024-08-04
0
TinyLlama/TinyLlama-1.1B-Chat-v1.0
TinyLlama_TinyLlama-1.1B-intermediate-step-1431k-3T_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TinyLlama__TinyLlama-1.1B-intermediate-step-1431k-3T-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
59f6f375b26bde864a6ca194a9a3044570490064
5.167378
apache-2.0
163
1
true
false
false
false
0.165798
0.227664
22.766371
0.307119
3.547093
0.008308
0.830816
0.252517
0.33557
0.338031
2.18724
0.112035
1.337175
false
true
2023-12-28
2024-11-27
0
TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
TinyLlama_TinyLlama_v1.1_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/TinyLlama/TinyLlama_v1.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TinyLlama/TinyLlama_v1.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TinyLlama__TinyLlama_v1.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TinyLlama/TinyLlama_v1.1
ff3c701f2424c7625fdefb9dd470f45ef18b02d6
4.723849
apache-2.0
77
1
true
false
false
false
0.248929
0.200061
20.006139
0.30237
3.210301
0.006042
0.60423
0.245805
0
0.369969
3.979427
0.104887
0.542996
false
true
2024-03-09
2024-06-12
0
TinyLlama/TinyLlama_v1.1
Trappu_Magnum-Picaro-0.7-v2-12b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Trappu/Magnum-Picaro-0.7-v2-12b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Trappu/Magnum-Picaro-0.7-v2-12b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Trappu__Magnum-Picaro-0.7-v2-12b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Trappu/Magnum-Picaro-0.7-v2-12b
2ffc46cde49eb823f5588990bd6b848cd505271e
21.478302
apache-2.0
7
12
true
false
false
false
1.674959
0.300279
30.027882
0.550666
35.746233
0.05136
5.135952
0.322987
9.731544
0.472719
19.55651
0.358045
28.67169
true
false
2024-09-11
2024-09-12
1
Trappu/Magnum-Picaro-0.7-v2-12b (Merge)
Trappu_Nemo-Picaro-12B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Trappu/Nemo-Picaro-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Trappu/Nemo-Picaro-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Trappu__Nemo-Picaro-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Trappu/Nemo-Picaro-12B
d65bf383d744998ae93a5589ec886532bb7e18eb
21.324728
apache-2.0
1
12
true
false
false
false
1.841028
0.257714
25.771398
0.548959
35.973135
0.082326
8.232628
0.327181
10.290828
0.472594
18.740885
0.360455
28.939495
false
false
2024-09-10
2024-09-22
2
royallab/MN-LooseCannon-12B-v2 (Merge)
Tremontaine_L3-12B-Lunaris-v1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Tremontaine/L3-12B-Lunaris-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Tremontaine/L3-12B-Lunaris-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Tremontaine__L3-12B-Lunaris-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Tremontaine/L3-12B-Lunaris-v1
7be236530a835416ebca712d51d661c4488a45de
25.502431
2
11
false
false
false
true
1.140964
0.690931
69.093117
0.523022
32.180746
0.089124
8.912387
0.309564
7.941834
0.367365
4.053906
0.377493
30.832595
false
false
2024-07-14
2024-07-15
1
Tremontaine/L3-12B-Lunaris-v1 (Merge)
Tsunami-th_Tsunami-0.5-7B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Tsunami-th/Tsunami-0.5-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Tsunami-th/Tsunami-0.5-7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Tsunami-th__Tsunami-0.5-7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Tsunami-th/Tsunami-0.5-7B-Instruct
10706336513d54c4e8962f54653f25941c4031f4
28.043411
apache-2.0
0
7
true
false
false
true
1.090053
0.740015
74.001538
0.552369
36.138254
0.001511
0.151057
0.308725
7.829978
0.425719
12.214844
0.441323
37.924793
false
false
2024-10-11
2024-10-12
1
Tsunami-th/Tsunami-0.5-7B-Instruct (Merge)
Tsunami-th_Tsunami-0.5x-7B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Tsunami-th/Tsunami-0.5x-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Tsunami-th/Tsunami-0.5x-7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Tsunami-th__Tsunami-0.5x-7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Tsunami-th/Tsunami-0.5x-7B-Instruct
83d048ab565893a660fa7eaeb4a749d360c76b53
29.823981
apache-2.0
1
7
true
false
false
true
1.058563
0.709915
70.991525
0.559287
37.363061
0.049849
4.984894
0.314597
8.612975
0.466677
18.567969
0.445811
38.423463
false
false
2024-10-15
2024-10-16
1
Tsunami-th/Tsunami-0.5x-7B-Instruct (Merge)
Tsunami-th_Tsunami-1.0-14B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Tsunami-th/Tsunami-1.0-14B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Tsunami-th/Tsunami-1.0-14B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Tsunami-th__Tsunami-1.0-14B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Tsunami-th/Tsunami-1.0-14B-Instruct
b468814b5242acbe6294226db71bc19dead6c8b6
34.199058
apache-2.0
0
14
true
false
false
true
1.653631
0.782905
78.290491
0.643876
49.150255
0
0
0.356544
14.205817
0.445938
16.342187
0.52485
47.2056
false
false
2024-10-25
2024-10-25
1
Tsunami-th/Tsunami-1.0-14B-Instruct (Merge)
Tsunami-th_Tsunami-1.0-7B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Tsunami-th/Tsunami-1.0-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Tsunami-th/Tsunami-1.0-7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Tsunami-th__Tsunami-1.0-7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Tsunami-th/Tsunami-1.0-7B-Instruct
34d0f8da8ce6b0de50a269eef622ff2e93e5c059
28.762308
apache-2.0
1
7
true
false
false
true
1.9814
0.730873
73.087297
0.549071
35.857243
0.01435
1.435045
0.312919
8.389262
0.449281
15.760156
0.442404
38.044843
false
false
2024-10-28
2024-10-28
1
Tsunami-th/Tsunami-1.0-7B-Instruct (Merge)
UCLA-AGI_Gemma-2-9B-It-SPPO-Iter1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/UCLA-AGI/Gemma-2-9B-It-SPPO-Iter1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">UCLA-AGI/Gemma-2-9B-It-SPPO-Iter1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/UCLA-AGI__Gemma-2-9B-It-SPPO-Iter1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
UCLA-AGI/Gemma-2-9B-It-SPPO-Iter1
33cfd6919f22efc38f71e9d21a7e697afb418e6b
21.088169
gemma
3
9
true
false
false
true
2.942658
0.308221
30.822108
0.596893
41.80923
0
0
0.336409
11.521253
0.409938
10.075521
0.390708
32.300901
false
false
2024-06-29
2024-09-21
0
UCLA-AGI/Gemma-2-9B-It-SPPO-Iter1
UCLA-AGI_Gemma-2-9B-It-SPPO-Iter2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/UCLA-AGI/Gemma-2-9B-It-SPPO-Iter2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">UCLA-AGI/Gemma-2-9B-It-SPPO-Iter2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/UCLA-AGI__Gemma-2-9B-It-SPPO-Iter2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
UCLA-AGI/Gemma-2-9B-It-SPPO-Iter2
b7590721d92bf6e0606e3dbc1ca2c229b7c534b4
21.216144
gemma
2
9
true
false
false
true
2.716462
0.31002
31.001964
0.598988
42.169834
0
0
0.334732
11.297539
0.413938
10.942188
0.386968
31.885343
false
false
2024-06-29
2024-08-07
0
UCLA-AGI/Gemma-2-9B-It-SPPO-Iter2
UCLA-AGI_Gemma-2-9B-It-SPPO-Iter3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/UCLA-AGI/Gemma-2-9B-It-SPPO-Iter3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">UCLA-AGI/Gemma-2-9B-It-SPPO-Iter3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/UCLA-AGI__Gemma-2-9B-It-SPPO-Iter3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
UCLA-AGI/Gemma-2-9B-It-SPPO-Iter3
2261f2a03b2e15de13a18da52590c237ecf5f188
21.46718
gemma
116
9
true
false
false
true
2.81515
0.316714
31.67141
0.600708
42.536752
0
0
0.338926
11.856823
0.416604
11.342188
0.382563
31.395907
false
false
2024-06-29
2024-07-31
0
UCLA-AGI/Gemma-2-9B-It-SPPO-Iter3
UCLA-AGI_Llama-3-Instruct-8B-SPPO-Iter1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/UCLA-AGI__Llama-3-Instruct-8B-SPPO-Iter1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter1
2076437f65776aeb9686c95f1f41515f70c4db27
24.640077
apache-2.0
0
8
true
false
false
true
0.701229
0.729899
72.989889
0.505789
29.489353
0.107251
10.725076
0.267617
2.348993
0.356792
2.165625
0.371094
30.121528
false
false
2024-06-25
2024-09-21
0
UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter1
UCLA-AGI_Llama-3-Instruct-8B-SPPO-Iter2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/UCLA-AGI__Llama-3-Instruct-8B-SPPO-Iter2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter2
730c7207d4b538feeb3c2e6d6f6a6ba8615a9be3
23.927649
apache-2.0
0
8
true
false
false
true
0.656767
0.698875
69.887454
0.50887
29.869449
0.096677
9.667674
0.266779
2.237136
0.359427
1.995052
0.369182
29.909131
false
false
2024-06-25
2024-08-07
0
UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter2
UCLA-AGI_Llama-3-Instruct-8B-SPPO-Iter3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/UCLA-AGI__Llama-3-Instruct-8B-SPPO-Iter3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter3
f73dafc2923acd56f115f21f76e9d14f8d19a63e
23.479398
apache-2.0
77
8
true
false
false
true
8.201812
0.683412
68.341224
0.507958
29.739684
0.083082
8.308157
0.265101
2.013423
0.366062
3.091146
0.364445
29.382757
false
false
2024-06-25
2024-07-02
0
UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter3
UCLA-AGI_Llama-3-Instruct-8B-SPPO-Iter3_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/UCLA-AGI__Llama-3-Instruct-8B-SPPO-Iter3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter3
f73dafc2923acd56f115f21f76e9d14f8d19a63e
23.05947
apache-2.0
77
8
true
false
false
true
0.910475
0.670298
67.029814
0.507641
29.716701
0.071752
7.175227
0.265101
2.013423
0.364729
2.891146
0.365775
29.530511
false
false
2024-06-25
2024-06-28
0
UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter3
UCLA-AGI_Mistral7B-PairRM-SPPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/UCLA-AGI/Mistral7B-PairRM-SPPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">UCLA-AGI/Mistral7B-PairRM-SPPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/UCLA-AGI__Mistral7B-PairRM-SPPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
UCLA-AGI/Mistral7B-PairRM-SPPO
abdc173603690fcf6b333b351c291a321d2631c3
16.35658
apache-2.0
6
7
true
false
false
true
0.504159
0.435492
43.549227
0.443898
22.084656
0.02568
2.567976
0.28104
4.138702
0.396479
7.793229
0.262051
18.005689
false
false
2024-05-04
2024-09-21
0
UCLA-AGI/Mistral7B-PairRM-SPPO
UCLA-AGI_Mistral7B-PairRM-SPPO-Iter1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/UCLA-AGI/Mistral7B-PairRM-SPPO-Iter1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">UCLA-AGI/Mistral7B-PairRM-SPPO-Iter1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/UCLA-AGI__Mistral7B-PairRM-SPPO-Iter1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
UCLA-AGI/Mistral7B-PairRM-SPPO-Iter1
97252e2d868725b2fa5055adc241c5182610fb6a
17.879981
apache-2.0
2
7
true
false
false
true
0.526674
0.504735
50.473521
0.446806
22.932292
0.022659
2.265861
0.283557
4.474273
0.399177
8.297135
0.269531
18.836806
false
false
2024-05-04
2024-09-21
0
UCLA-AGI/Mistral7B-PairRM-SPPO-Iter1
UCLA-AGI_Mistral7B-PairRM-SPPO-Iter2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/UCLA-AGI/Mistral7B-PairRM-SPPO-Iter2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">UCLA-AGI/Mistral7B-PairRM-SPPO-Iter2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/UCLA-AGI__Mistral7B-PairRM-SPPO-Iter2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
UCLA-AGI/Mistral7B-PairRM-SPPO-Iter2
8201064df67b5762ff9f361ff1b98aae3747855c
17.030023
apache-2.0
1
7
true
false
false
true
0.515484
0.444585
44.458481
0.446572
22.479924
0.016616
1.661631
0.288591
5.145414
0.408542
9.801042
0.267703
18.633644
false
false
2024-05-04
2024-08-07
0
UCLA-AGI/Mistral7B-PairRM-SPPO-Iter2
UCLA-AGI_Mistral7B-PairRM-SPPO-Iter3_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/UCLA-AGI/Mistral7B-PairRM-SPPO-Iter3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">UCLA-AGI/Mistral7B-PairRM-SPPO-Iter3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/UCLA-AGI__Mistral7B-PairRM-SPPO-Iter3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
UCLA-AGI/Mistral7B-PairRM-SPPO-Iter3
72cd8e5435ae679249ddad7ac4cdb64c5b4590c3
16.413129
apache-2.0
5
7
true
false
false
true
0.518408
0.435068
43.506784
0.439659
21.817496
0.018882
1.888218
0.275168
3.355705
0.407115
9.489323
0.265791
18.421247
false
false
2024-05-04
2024-08-07
0
UCLA-AGI/Mistral7B-PairRM-SPPO-Iter3
UKzExecution_LlamaExecutor-8B-3.0.5_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/UKzExecution/LlamaExecutor-8B-3.0.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">UKzExecution/LlamaExecutor-8B-3.0.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/UKzExecution__LlamaExecutor-8B-3.0.5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
UKzExecution/LlamaExecutor-8B-3.0.5
2047978e8ab1146b8881cde3d998856594f437a4
24.490727
0
8
false
false
false
true
0.807002
0.74029
74.029021
0.5006
28.413815
0.098943
9.89426
0.255872
0.782998
0.375365
4.653906
0.362533
29.170361
false
false
2024-07-29
2024-07-30
1
UKzExecution/LlamaExecutor-8B-3.0.5 (Merge)
Unbabel_TowerInstruct-Mistral-7B-v0.2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Unbabel/TowerInstruct-Mistral-7B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Unbabel/TowerInstruct-Mistral-7B-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Unbabel__TowerInstruct-Mistral-7B-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Unbabel/TowerInstruct-Mistral-7B-v0.2
454bdfedc8b51f292a402aba2c560df145a0817d
11.827188
cc-by-nc-4.0
14
7
true
false
false
false
0.60389
0.284342
28.434221
0.388195
14.224326
0.015861
1.586103
0.247483
0
0.452229
15.961979
0.196809
10.756501
false
false
2024-03-26
2024-09-06
0
Unbabel/TowerInstruct-Mistral-7B-v0.2
Undi95_MG-FinalMix-72B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Undi95/MG-FinalMix-72B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Undi95/MG-FinalMix-72B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Undi95__MG-FinalMix-72B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Undi95/MG-FinalMix-72B
6c9c2f5d052495dcd49f44bf5623d21210653c65
43.693133
other
4
72
true
false
false
true
12.222024
0.801365
80.136482
0.697302
57.502412
0.361027
36.102719
0.385067
18.008949
0.482271
21.217187
0.542719
49.191046
true
false
2024-06-25
2024-07-13
1
Undi95/MG-FinalMix-72B (Merge)
VAGOsolutions_Llama-3-SauerkrautLM-70b-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/Llama-3-SauerkrautLM-70b-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/Llama-3-SauerkrautLM-70b-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__Llama-3-SauerkrautLM-70b-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/Llama-3-SauerkrautLM-70b-Instruct
707cfd1a93875247c0223e0c7e3d86d58c432318
38.169233
other
23
70
true
false
false
true
10.626191
0.804462
80.446216
0.666325
52.02958
0.237915
23.791541
0.32802
10.402685
0.433938
13.542188
0.539229
48.803191
false
false
2024-04-24
2024-06-26
0
VAGOsolutions/Llama-3-SauerkrautLM-70b-Instruct
VAGOsolutions_Llama-3-SauerkrautLM-8b-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/Llama-3-SauerkrautLM-8b-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/Llama-3-SauerkrautLM-8b-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__Llama-3-SauerkrautLM-8b-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/Llama-3-SauerkrautLM-8b-Instruct
37127c44d7c0fb56cef817270c4b1a6802d8793a
26.667655
other
53
8
true
false
false
true
0.795694
0.744537
74.453672
0.494338
28.049242
0.066465
6.646526
0.308725
7.829978
0.424104
11.279688
0.385721
31.746823
false
false
2024-04-19
2024-07-22
0
VAGOsolutions/Llama-3-SauerkrautLM-8b-Instruct
VAGOsolutions_Llama-3.1-SauerkrautLM-70b-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/Llama-3.1-SauerkrautLM-70b-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/Llama-3.1-SauerkrautLM-70b-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__Llama-3.1-SauerkrautLM-70b-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/Llama-3.1-SauerkrautLM-70b-Instruct
e8e74aa789243c25a3a8f7565780a402f5050bbb
42.671071
llama3.1
19
70
true
false
false
true
15.091617
0.865637
86.563651
0.700625
57.241621
0.324773
32.477341
0.341443
12.192394
0.471083
19.385417
0.533494
48.166002
false
false
2024-07-29
2024-08-26
0
VAGOsolutions/Llama-3.1-SauerkrautLM-70b-Instruct
VAGOsolutions_Llama-3.1-SauerkrautLM-8b-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/Llama-3.1-SauerkrautLM-8b-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/Llama-3.1-SauerkrautLM-8b-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__Llama-3.1-SauerkrautLM-8b-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/Llama-3.1-SauerkrautLM-8b-Instruct
23ca79966a4ab0a61f7ccc7a0454ffef553b66eb
28.68485
llama3.1
32
8
true
false
false
true
1.656874
0.801739
80.173938
0.511493
30.999361
0.119335
11.933535
0.290268
5.369128
0.414802
11.516927
0.389046
32.116209
false
false
2024-07-25
2024-07-29
0
VAGOsolutions/Llama-3.1-SauerkrautLM-8b-Instruct
VAGOsolutions_SauerkrautLM-1.5b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/SauerkrautLM-1.5b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/SauerkrautLM-1.5b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__SauerkrautLM-1.5b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/SauerkrautLM-1.5b
8f5170f03e6b0355dd920adc3a7e65d0417ee14e
10.122505
apache-2.0
11
1
true
false
false
true
0.776237
0.240403
24.040324
0.370391
13.419518
0.02719
2.719033
0.270973
2.796421
0.373906
4.971615
0.215093
12.788121
false
false
2024-06-12
2024-06-26
0
VAGOsolutions/SauerkrautLM-1.5b
VAGOsolutions_SauerkrautLM-7b-HerO_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/SauerkrautLM-7b-HerO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/SauerkrautLM-7b-HerO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__SauerkrautLM-7b-HerO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/SauerkrautLM-7b-HerO
3a14b437e2f375b74de3b6923e171662133347bb
19.694488
apache-2.0
32
7
true
false
false
true
0.569769
0.53461
53.461039
0.490443
27.991874
0.040785
4.07855
0.272651
3.020134
0.392385
6.88151
0.304604
22.733821
true
false
2023-11-24
2024-06-26
0
VAGOsolutions/SauerkrautLM-7b-HerO
VAGOsolutions_SauerkrautLM-7b-LaserChat_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/SauerkrautLM-7b-LaserChat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/SauerkrautLM-7b-LaserChat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__SauerkrautLM-7b-LaserChat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/SauerkrautLM-7b-LaserChat
cb759636a3d5b0768df2f43a3d3da9b17e10e7b9
22.134728
apache-2.0
12
7
true
false
false
true
0.604679
0.598782
59.878234
0.454327
22.99208
0.077039
7.703927
0.300336
6.711409
0.414802
9.916927
0.330452
25.605792
false
false
2024-02-05
2024-06-26
0
VAGOsolutions/SauerkrautLM-7b-LaserChat
VAGOsolutions_SauerkrautLM-Gemma-2b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/SauerkrautLM-Gemma-2b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/SauerkrautLM-Gemma-2b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__SauerkrautLM-Gemma-2b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/SauerkrautLM-Gemma-2b
f9d5575c23da96f33ce77dea3b0776746b9469bc
7.61539
other
8
2
true
false
false
true
0.916235
0.247522
24.752213
0.341632
9.13387
0.021903
2.190332
0.256711
0.894855
0.367583
3.514583
0.146858
5.206486
false
false
2024-03-06
2024-06-26
0
VAGOsolutions/SauerkrautLM-Gemma-2b
VAGOsolutions_SauerkrautLM-Gemma-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/SauerkrautLM-Gemma-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/SauerkrautLM-Gemma-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__SauerkrautLM-Gemma-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/SauerkrautLM-Gemma-7b
4296bdabf82e900235b094e5348be03ebb0ec891
14.60057
other
13
8
true
false
false
true
1.513249
0.340671
34.067053
0.418791
18.492652
0.055136
5.513595
0.286074
4.809843
0.359427
2.928385
0.296127
21.791888
false
false
2024-02-27
2024-06-26
0
VAGOsolutions/SauerkrautLM-Gemma-7b
VAGOsolutions_SauerkrautLM-Mixtral-8x7B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/SauerkrautLM-Mixtral-8x7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/SauerkrautLM-Mixtral-8x7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__SauerkrautLM-Mixtral-8x7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/SauerkrautLM-Mixtral-8x7B-Instruct
30ed549de7d84f68b4c6cb619f73275c99af23cc
24.474879
apache-2.0
22
46
true
true
false
true
3.77439
0.560189
56.018919
0.527734
33.945163
0.097432
9.743202
0.297819
6.375839
0.420417
11.31875
0.365027
29.4474
false
false
2023-12-15
2024-06-26
0
VAGOsolutions/SauerkrautLM-Mixtral-8x7B-Instruct
VAGOsolutions_SauerkrautLM-Nemo-12b-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/SauerkrautLM-Nemo-12b-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/SauerkrautLM-Nemo-12b-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__SauerkrautLM-Nemo-12b-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/SauerkrautLM-Nemo-12b-Instruct
fcb056465084ab2c71503a0760f46e4be79c985c
25.765909
apache-2.0
22
12
true
false
false
true
1.380527
0.611297
61.129691
0.521413
32.343783
0.095166
9.516616
0.309564
7.941834
0.446896
17.161979
0.338514
26.501551
false
false
2024-07-22
2024-07-22
0
VAGOsolutions/SauerkrautLM-Nemo-12b-Instruct
VAGOsolutions_SauerkrautLM-Phi-3-medium_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/SauerkrautLM-Phi-3-medium" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/SauerkrautLM-Phi-3-medium</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__SauerkrautLM-Phi-3-medium-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/SauerkrautLM-Phi-3-medium
ebfed26a2b35ede15fe526f57029e0ad866ac66d
30.319798
mit
9
13
true
false
false
false
0.782748
0.440888
44.088796
0.643293
49.63035
0.154834
15.483384
0.334732
11.297539
0.4845
20.695833
0.466506
40.722887
false
false
2024-06-09
2024-09-19
0
VAGOsolutions/SauerkrautLM-Phi-3-medium
VAGOsolutions_SauerkrautLM-SOLAR-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/SauerkrautLM-SOLAR-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/SauerkrautLM-SOLAR-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__SauerkrautLM-SOLAR-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/SauerkrautLM-SOLAR-Instruct
2665d7600ccd253728453433d2434844e6f702bd
20.164244
cc-by-nc-4.0
47
10
true
false
false
true
0.81573
0.491721
49.172086
0.516945
31.83892
0
0
0.305369
7.38255
0.396542
8.334375
0.318318
24.257535
false
false
2023-12-20
2024-06-26
0
VAGOsolutions/SauerkrautLM-SOLAR-Instruct
VAGOsolutions_SauerkrautLM-gemma-2-2b-it_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/SauerkrautLM-gemma-2-2b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/SauerkrautLM-gemma-2-2b-it</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__SauerkrautLM-gemma-2-2b-it-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/SauerkrautLM-gemma-2-2b-it
7fd35fcb32aebfc422e535739161d7528fc562d5
10.465202
gemma
8
2
true
false
false
true
3.159145
0.132066
13.206625
0.424084
18.914195
0.000755
0.075529
0.272651
3.020134
0.399458
8.765625
0.269282
18.809102
false
false
2024-08-03
2024-08-26
0
VAGOsolutions/SauerkrautLM-gemma-2-2b-it
VAGOsolutions_SauerkrautLM-gemma-2-9b-it_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/SauerkrautLM-gemma-2-9b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/SauerkrautLM-gemma-2-9b-it</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__SauerkrautLM-gemma-2-9b-it-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/SauerkrautLM-gemma-2-9b-it
8e02fc1c24e0499c74ee1186ddc46b989fe497f1
21.83265
gemma
5
9
true
false
false
true
2.906068
0.302401
30.240096
0.607265
43.249989
0.005287
0.528701
0.327181
10.290828
0.431823
12.344531
0.409076
34.341755
false
false
2024-08-12
2024-08-26
0
VAGOsolutions/SauerkrautLM-gemma-2-9b-it
VAGOsolutions_SauerkrautLM-v2-14b-DPO_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/SauerkrautLM-v2-14b-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/SauerkrautLM-v2-14b-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__SauerkrautLM-v2-14b-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/SauerkrautLM-v2-14b-DPO
1fbe5364bc443255a06df7fa0debbcc3d38ab866
36.866369
apache-2.0
16
14
true
false
false
true
1.477232
0.741165
74.116455
0.656037
50.926132
0.273414
27.34139
0.319631
9.284116
0.437469
13.783594
0.511719
45.746528
false
false
2024-10-31
2024-11-04
1
VAGOsolutions/SauerkrautLM-v2-14b-DPO (Merge)
VAGOsolutions_SauerkrautLM-v2-14b-SFT_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/SauerkrautLM-v2-14b-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/SauerkrautLM-v2-14b-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__SauerkrautLM-v2-14b-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/SauerkrautLM-v2-14b-SFT
606ddc7819d4a5d9cd8618d5ede57e2bdd99a1ed
35.649022
apache-2.0
7
14
true
false
false
true
1.51631
0.696377
69.637672
0.621036
45.824351
0.292296
29.229607
0.33557
11.409396
0.417875
11.067708
0.520529
46.725399
false
false
2024-10-25
2024-11-04
1
VAGOsolutions/SauerkrautLM-v2-14b-SFT (Merge)
VIRNECT_llama-3-Korean-8B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/VIRNECT/llama-3-Korean-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VIRNECT/llama-3-Korean-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VIRNECT__llama-3-Korean-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VIRNECT/llama-3-Korean-8B
c658409e094ff04eeb6ab6cee2d4bc56716e45f1
20.245301
llama3
0
8
true
false
false
true
0.812687
0.505835
50.583452
0.490825
27.322412
0.0929
9.29003
0.270973
2.796421
0.366156
3.269531
0.35389
28.209959
false
false
2024-07-17
2024-07-17
0
VIRNECT/llama-3-Korean-8B
VIRNECT_llama-3-Korean-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/VIRNECT/llama-3-Korean-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VIRNECT/llama-3-Korean-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VIRNECT__llama-3-Korean-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VIRNECT/llama-3-Korean-8B
c658409e094ff04eeb6ab6cee2d4bc56716e45f1
20.406433
llama3
0
8
true
false
false
true
0.820307
0.502138
50.213766
0.491838
27.564319
0.106495
10.649547
0.270973
2.796421
0.364792
3.032292
0.35364
28.182255
false
false
2024-07-17
2024-07-17
0
VIRNECT/llama-3-Korean-8B
VIRNECT_llama-3-Korean-8B-r-v-0.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/VIRNECT/llama-3-Korean-8B-r-v-0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VIRNECT/llama-3-Korean-8B-r-v-0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VIRNECT__llama-3-Korean-8B-r-v-0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VIRNECT/llama-3-Korean-8B-r-v-0.1
10acb1aa4f341f2d3c899d78c520b0822a909b95
18.67375
llama3
0
16
true
false
false
true
1.199491
0.491571
49.157125
0.480616
25.884954
0.081571
8.1571
0.24245
0
0.36749
3.736198
0.325964
25.107122
false
false
2024-07-18
2024-07-18
2
MLP-KTLim/llama-3-Korean-Bllossom-8B (Merge)
ValiantLabs_Llama3-70B-Fireplace_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3-70B-Fireplace" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3-70B-Fireplace</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3-70B-Fireplace-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3-70B-Fireplace
220079e4115733991eb19c30d5480db9696a665e
37.150403
llama3
3
70
true
false
false
true
9.692172
0.77736
77.735963
0.648899
49.55653
0.216012
21.601208
0.354866
13.982103
0.444854
16.773438
0.489279
43.253177
false
false
2024-05-09
2024-06-26
0
ValiantLabs/Llama3-70B-Fireplace
ValiantLabs_Llama3-70B-ShiningValiant2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3-70B-ShiningValiant2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3-70B-ShiningValiant2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3-70B-ShiningValiant2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3-70B-ShiningValiant2
bd6cce8da08ccefe9ec58cae3df4bf75c97d8950
30.603092
llama3
4
70
true
false
false
true
11.217594
0.612171
61.217126
0.633834
46.710261
0.08006
8.006042
0.330537
10.738255
0.432573
13.638281
0.489777
43.308585
false
false
2024-04-20
2024-07-25
0
ValiantLabs/Llama3-70B-ShiningValiant2
ValiantLabs_Llama3.1-70B-ShiningValiant2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.1-70B-ShiningValiant2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.1-70B-ShiningValiant2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.1-70B-ShiningValiant2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.1-70B-ShiningValiant2
55436621ed65f0b79e7c6324b780bd6a18e06c79
36.165893
llama3.1
3
70
true
false
false
false
13.997285
0.535535
53.55346
0.673841
52.390969
0.271903
27.190332
0.392617
19.01566
0.468104
18.479688
0.517287
46.365248
false
false
2024-10-30
2024-10-30
2
meta-llama/Meta-Llama-3.1-70B
ValiantLabs_Llama3.1-8B-Cobalt_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.1-8B-Cobalt" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.1-8B-Cobalt</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.1-8B-Cobalt-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.1-8B-Cobalt
3a69145a2acc1f7f51735aa3ae5d81c090249c65
20.214218
llama3.1
6
8
true
false
false
false
1.779277
0.349613
34.961347
0.494677
27.417777
0.125378
12.537764
0.303691
7.158837
0.395948
9.826823
0.364445
29.382757
false
false
2024-08-16
2024-10-02
2
meta-llama/Meta-Llama-3.1-8B
ValiantLabs_Llama3.1-8B-Cobalt_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.1-8B-Cobalt" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.1-8B-Cobalt</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.1-8B-Cobalt-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.1-8B-Cobalt
3a69145a2acc1f7f51735aa3ae5d81c090249c65
25.558664
llama3.1
6
8
true
false
false
true
0.938171
0.716835
71.683467
0.49107
27.235483
0.153323
15.332326
0.286074
4.809843
0.35124
4.704948
0.366273
29.585919
false
false
2024-08-16
2024-09-20
2
meta-llama/Meta-Llama-3.1-8B
ValiantLabs_Llama3.1-8B-Enigma_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.1-8B-Enigma" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.1-8B-Enigma</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.1-8B-Enigma-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.1-8B-Enigma
332c99d80f378c77b090745a5aac10f8ab339519
16.599981
llama3.1
8
8
true
false
false
false
6.415623
0.268055
26.805543
0.44776
22.012915
0.087613
8.761329
0.287752
5.033557
0.419604
10.217188
0.340924
26.769356
false
false
2024-08-11
2024-10-02
2
meta-llama/Meta-Llama-3.1-8B
ValiantLabs_Llama3.1-8B-Esper2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.1-8B-Esper2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.1-8B-Esper2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.1-8B-Esper2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.1-8B-Esper2
38f24f2fe90f839acbc57e7530221acf1232e9dc
13.840105
llama3.1
2
8
true
false
false
false
0.876765
0.25674
25.673989
0.446987
22.195685
0.05287
5.287009
0.272651
3.020134
0.356073
5.709115
0.290392
21.154699
false
false
2024-10-02
2024-10-09
2
meta-llama/Meta-Llama-3.1-8B
ValiantLabs_Llama3.1-8B-Fireplace2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.1-8B-Fireplace2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.1-8B-Fireplace2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.1-8B-Fireplace2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.1-8B-Fireplace2
be3a5c18b5e8e86a3703df1a8227f784ad2c713c
18.312602
llama3.1
6
8
true
false
false
true
0.916234
0.548324
54.8324
0.460982
24.070273
0.058157
5.81571
0.288591
5.145414
0.343302
4.379427
0.240691
15.632388
false
false
2024-07-23
2024-07-25
2
meta-llama/Meta-Llama-3.1-8B
ValiantLabs_Llama3.1-8B-Fireplace2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.1-8B-Fireplace2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.1-8B-Fireplace2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.1-8B-Fireplace2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.1-8B-Fireplace2
ef129903bbdcc59efdbe10fe9061bff473334a99
18.218113
llama3.1
6
8
true
false
false
true
0.901098
0.532812
53.281183
0.461331
24.089954
0.066465
6.646526
0.28943
5.257271
0.336667
4.216667
0.242354
15.81708
false
false
2024-07-23
2024-08-10
2
meta-llama/Meta-Llama-3.1-8B
ValiantLabs_Llama3.1-8B-ShiningValiant2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.1-8B-ShiningValiant2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.1-8B-ShiningValiant2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.1-8B-ShiningValiant2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.1-8B-ShiningValiant2
6b2b5694a192cb29ad0e4314138affa25b630c0e
24.36574
llama3.1
16
8
true
false
false
true
1.675031
0.649565
64.956538
0.477391
26.346119
0.129154
12.915408
0.310403
8.053691
0.390865
7.458073
0.338182
26.464613
false
false
2024-08-06
2024-08-10
2
meta-llama/Meta-Llama-3.1-8B
ValiantLabs_Llama3.1-8B-ShiningValiant2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.1-8B-ShiningValiant2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.1-8B-ShiningValiant2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.1-8B-ShiningValiant2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.1-8B-ShiningValiant2
6b2b5694a192cb29ad0e4314138affa25b630c0e
15.458036
llama3.1
16
8
true
false
false
false
6.528982
0.267806
26.780609
0.442929
21.61815
0.052115
5.21148
0.302013
6.935123
0.395917
10.789583
0.292719
21.413268
false
false
2024-08-06
2024-11-05
2
meta-llama/Meta-Llama-3.1-8B
ValiantLabs_Llama3.2-3B-Enigma_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.2-3B-Enigma" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.2-3B-Enigma</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.2-3B-Enigma-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.2-3B-Enigma
ca6adf3a289ce47c7598139e7a312e2b4b3708ce
11.642378
llama3.2
7
3
true
false
false
false
1.505355
0.278622
27.862183
0.372259
12.434026
0.040785
4.07855
0.261745
1.565996
0.392135
8.05026
0.242769
15.863254
false
false
2024-09-30
2024-10-02
1
meta-llama/Llama-3.2-3B-Instruct
ValiantLabs_Llama3.2-3B-Esper2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.2-3B-Esper2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.2-3B-Esper2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.2-3B-Esper2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.2-3B-Esper2
64a2c619a2e1680ab42945fcf5b75a5242cab3a1
10.730297
llama3.2
3
3
true
false
false
false
0.738884
0.274975
27.497484
0.380826
13.851733
0.023414
2.34139
0.270134
2.684564
0.354958
4.036458
0.225731
13.970154
false
false
2024-10-03
2024-10-09
1
meta-llama/Llama-3.2-3B-Instruct
ValiantLabs_Llama3.2-3B-ShiningValiant2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.2-3B-ShiningValiant2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.2-3B-ShiningValiant2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.2-3B-ShiningValiant2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.2-3B-ShiningValiant2
1336e200485675c9b92baae17831eab17c601803
14.214462
llama3.2
3
3
true
false
false
false
2.856299
0.26251
26.251014
0.422593
18.912709
0.071752
7.175227
0.280201
4.026846
0.386646
8.597396
0.282912
20.323582
false
false
2024-09-27
2024-11-05
1
meta-llama/Llama-3.2-3B-Instruct
Vikhrmodels_Vikhr-Llama3.1-8B-Instruct-R-21-09-24_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Vikhrmodels/Vikhr-Llama3.1-8B-Instruct-R-21-09-24" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Vikhrmodels/Vikhr-Llama3.1-8B-Instruct-R-21-09-24</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Vikhrmodels__Vikhr-Llama3.1-8B-Instruct-R-21-09-24-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Vikhrmodels/Vikhr-Llama3.1-8B-Instruct-R-21-09-24
c0b57cf6d4444b35fc5cec0525ff5eef32af22c9
24.838839
apache-2.0
26
8
true
false
false
true
0.856612
0.643146
64.314574
0.527224
32.669417
0.186556
18.655589
0.244966
0
0.375396
5.091146
0.354721
28.302305
false
false
2024-09-20
2024-09-21
1
Vikhrmodels/Vikhr-Llama3.1-8B-Instruct-R-21-09-24 (Merge)
Vikhrmodels_Vikhr-Nemo-12B-Instruct-R-21-09-24_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Vikhrmodels/Vikhr-Nemo-12B-Instruct-R-21-09-24" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Vikhrmodels/Vikhr-Nemo-12B-Instruct-R-21-09-24</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Vikhrmodels__Vikhr-Nemo-12B-Instruct-R-21-09-24-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Vikhrmodels/Vikhr-Nemo-12B-Instruct-R-21-09-24
6abd887cb631f705042c9e8085615fe4d76e9779
24.403136
apache-2.0
94
12
true
false
false
true
1.720792
0.599932
59.993152
0.521231
31.414409
0.134441
13.444109
0.291107
5.480984
0.407302
9.446094
0.339761
26.640071
false
false
2024-09-20
2024-09-21
1
Vikhrmodels/Vikhr-Nemo-12B-Instruct-R-21-09-24 (Merge)
Weyaxi_Bagel-Hermes-2x34B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/Weyaxi/Bagel-Hermes-2x34B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Weyaxi/Bagel-Hermes-2x34B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Weyaxi__Bagel-Hermes-2x34B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Weyaxi/Bagel-Hermes-2x34B
44fddd32d7dcafc0fa670fd87a2e129310640aac
25.472804
apache-2.0
16
60
true
true
false
true
9.815168
0.543153
54.315328
0.491666
27.409031
0.052115
5.21148
0.32802
10.402685
0.451667
15.625
0.45886
39.873301
false
false
2024-01-12
2024-10-28
0
Weyaxi/Bagel-Hermes-2x34B
Weyaxi_Bagel-Hermes-34B-Slerp_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Weyaxi/Bagel-Hermes-34B-Slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Weyaxi/Bagel-Hermes-34B-Slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Weyaxi__Bagel-Hermes-34B-Slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Weyaxi/Bagel-Hermes-34B-Slerp
dcdcc17a2c650a95bc27129a3ddbf261dffed37f
27.209093
apache-2.0
1
34
true
false
false
false
3.021375
0.460272
46.027208
0.59219
41.957047
0.058157
5.81571
0.334732
11.297539
0.462208
17.009375
0.470329
41.14768
true
false
2024-01-12
2024-08-30
0
Weyaxi/Bagel-Hermes-34B-Slerp
Weyaxi_Einstein-v4-7B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Weyaxi/Einstein-v4-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Weyaxi/Einstein-v4-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Weyaxi__Einstein-v4-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Weyaxi/Einstein-v4-7B
7eecd9833b8a012e23ac1df789884888b047baa0
16.768252
other
47
7
true
false
false
true
0.667754
0.470813
47.0813
0.384947
14.304451
0.019637
1.963746
0.281879
4.250559
0.468167
19.020833
0.225898
13.988623
false
false
2024-02-22
2024-06-26
1
mistralai/Mistral-7B-v0.1
Weyaxi_Einstein-v6.1-Llama3-8B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Weyaxi/Einstein-v6.1-Llama3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Weyaxi/Einstein-v6.1-Llama3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Weyaxi__Einstein-v6.1-Llama3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Weyaxi/Einstein-v6.1-Llama3-8B
5cab6d54666b6024d0f745d61abf1842edb934e0
20.119139
other
66
8
true
false
false
true
0.859598
0.456825
45.682456
0.50083
29.383773
0.064955
6.495468
0.281879
4.250559
0.421281
11.226823
0.313082
23.675754
false
false
2024-04-19
2024-06-26
1
meta-llama/Meta-Llama-3-8B
Weyaxi_Einstein-v6.1-developed-by-Weyaxi-Llama3-8B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Weyaxi/Einstein-v6.1-developed-by-Weyaxi-Llama3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Weyaxi/Einstein-v6.1-developed-by-Weyaxi-Llama3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Weyaxi__Einstein-v6.1-developed-by-Weyaxi-Llama3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Weyaxi/Einstein-v6.1-developed-by-Weyaxi-Llama3-8B
b7507e94146c0832c26609e9ab8115934d3e25b3
19.20545
other
1
8
true
false
false
true
0.871728
0.392702
39.270247
0.504384
29.694447
0.064955
6.495468
0.27349
3.131991
0.43325
13.389583
0.309259
23.25096
false
false
2024-06-23
2024-06-26
0
Weyaxi/Einstein-v6.1-developed-by-Weyaxi-Llama3-8B
Weyaxi_Einstein-v7-Qwen2-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Weyaxi/Einstein-v7-Qwen2-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Weyaxi/Einstein-v7-Qwen2-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Weyaxi__Einstein-v7-Qwen2-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Weyaxi/Einstein-v7-Qwen2-7B
d5a2f245bf98a40d196821bc378e10f35b4da81a
24.239953
other
35
7
true
false
false
true
1.313971
0.409963
40.996334
0.516147
32.841819
0.165408
16.540785
0.299497
6.599553
0.439979
14.064063
0.409574
34.397163
false
false
2024-06-24
2024-06-26
1
Qwen/Qwen2-7B
Weyaxi_Einstein-v8-Llama3.2-1B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Weyaxi/Einstein-v8-Llama3.2-1B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Weyaxi/Einstein-v8-Llama3.2-1B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Weyaxi__Einstein-v8-Llama3.2-1B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Weyaxi/Einstein-v8-Llama3.2-1B
1edc6abcb8eedd047bc40b79d2d36c3723ff28e2
4.627821
llama3.2
1
1
true
false
false
true
0.387925
0.186223
18.622256
0.301843
3.013774
0
0
0.258389
1.118568
0.361781
3.222656
0.116107
1.789672
false
false
2024-09-28
2024-09-30
1
meta-llama/Llama-3.2-1B
Weyaxi_SauerkrautLM-UNA-SOLAR-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Weyaxi__SauerkrautLM-UNA-SOLAR-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct
9678b9ca952abe0083dbfc772a56b849866bfa1a
19.708133
cc-by-nc-4.0
26
10
true
false
false
true
0.746011
0.457324
45.732434
0.516636
31.824687
0
0
0.311242
8.165548
0.397875
8.601042
0.315326
23.925089
true
false
2023-12-21
2024-06-26
0
Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct
WizardLMTeam_WizardLM-13B-V1.0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/WizardLMTeam/WizardLM-13B-V1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">WizardLMTeam/WizardLM-13B-V1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/WizardLMTeam__WizardLM-13B-V1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
WizardLMTeam/WizardLM-13B-V1.0
964a93aa2e78da377115bb856075a69ebe8aefa4
4.546092

73
13
true
false
false
false
70.977587
0.185049
18.5049
0.291344
2.147967
0
0
0.259228
1.230425
0.349719
3.548177
0.116606
1.84508
false
true
2023-05-13
2024-06-13
0
WizardLMTeam/WizardLM-13B-V1.0
WizardLMTeam_WizardLM-13B-V1.2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/WizardLMTeam/WizardLM-13B-V1.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">WizardLMTeam/WizardLM-13B-V1.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/WizardLMTeam__WizardLM-13B-V1.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
WizardLMTeam/WizardLM-13B-V1.2
cf5f40382559f19e13874e45b39575171ca46ef8
15.164945
llama2
225
13
true
false
false
false
3.519458
0.339247
33.924653
0.4462
22.888655
0.018127
1.812689
0.260906
1.454139
0.437844
14.030469
0.251912
16.879063
false
true
2023-07-25
2024-06-12
0
WizardLMTeam/WizardLM-13B-V1.2
WizardLMTeam_WizardLM-70B-V1.0_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/WizardLMTeam/WizardLM-70B-V1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">WizardLMTeam/WizardLM-70B-V1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/WizardLMTeam__WizardLM-70B-V1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
WizardLMTeam/WizardLM-70B-V1.0
54aaecaff7d0790eb9f0ecea1cc267a94cc66949
22.359678
llama2
235
70
true
false
false
false
29.096063
0.495143
49.514289
0.559037
37.543355
0.037009
3.700906
0.26594
2.12528
0.439115
14.089323
0.344664
27.184914
false
true
2023-08-09
2024-06-12
0
WizardLMTeam/WizardLM-70B-V1.0
Xclbr7_Arcanum-12b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Xclbr7/Arcanum-12b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Xclbr7/Arcanum-12b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Xclbr7__Arcanum-12b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Xclbr7/Arcanum-12b
845ac67d2b527296ae8c06da4453bf8a60f2e59b
20.694285
mit
1
12
true
false
false
false
1.690511
0.290686
29.068649
0.526536
31.87996
0.115559
11.555891
0.32047
9.395973
0.417031
13.528906
0.358627
28.736333
false
false
2024-09-17
2024-09-17
0
Xclbr7/Arcanum-12b