Columns (name: dtype, observed range or number of distinct values):

eval_name: string (length 12 to 111)
Precision: string (3 values)
Type: string (6 values)
T: string (6 values)
Weight type: string (2 values)
Architecture: string (52 values)
Model: string (length 355 to 689)
fullname: string (length 4 to 102)
Model sha: string (length 0 to 40)
Average ⬆️: float64 (1.03 to 52)
Hub License: string (26 values)
Hub ❤️: int64 (0 to 5.9k)
#Params (B): int64 (-1 to 140)
Available on the hub: bool (2 classes)
MoE: bool (2 classes)
Flagged: bool (2 classes)
Chat Template: bool (2 classes)
CO₂ cost (kg): float64 (0.03 to 107)
IFEval Raw: float64 (0 to 0.9)
IFEval: float64 (0 to 90)
BBH Raw: float64 (0.27 to 0.75)
BBH: float64 (0.81 to 63.5)
MATH Lvl 5 Raw: float64 (0 to 0.51)
MATH Lvl 5: float64 (0 to 50.7)
GPQA Raw: float64 (0.22 to 0.44)
GPQA: float64 (0 to 24.9)
MUSR Raw: float64 (0.29 to 0.6)
MUSR: float64 (0 to 38.5)
MMLU-PRO Raw: float64 (0.1 to 0.73)
MMLU-PRO: float64 (0 to 70)
Merged: bool (2 classes)
Official Providers: bool (2 classes)
Upload To Hub Date: string (424 values)
Submission Date: string (169 values)
Generation: int64 (0 to 10)
Base Model: string (length 4 to 102)
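These columns make up the Open LLM Leaderboard results table. As a minimal sketch of how the table could be loaded and inspected with the 🤗 `datasets` library (the repo id `open-llm-leaderboard/contents` and the `train` split are assumptions here, not confirmed by this dump):

```python
# Minimal sketch: load the leaderboard table and look at the columns listed above.
# Assumptions: the table is published as "open-llm-leaderboard/contents" with a
# "train" split; adjust the repo id/split if your copy lives elsewhere.
from datasets import load_dataset

ds = load_dataset("open-llm-leaderboard/contents", split="train")

print(ds.column_names)  # should line up with the column list above
print(ds.num_rows)

# Models ranked by the aggregate score, highest first.
top = ds.sort("Average ⬆️", reverse=True).select(range(5))
for row in top:
    print(row["fullname"], row["Average ⬆️"], row["#Params (B)"])
```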
NotASI_FineTome-v1.5-Llama3.2-3B-1007_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/NotASI/FineTome-v1.5-Llama3.2-3B-1007 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/NotASI__FineTome-v1.5-Llama3.2-3B-1007-details
NotASI/FineTome-v1.5-Llama3.2-3B-1007
6c6e71fbcff6c00d04a3fd69084af20bf2a943c8
16.962639
llama3.2
1
3
true
false
false
true
0.725379
0.550772
55.077195
0.431237
19.457219
0.055136
5.513595
0.261745
1.565996
0.364542
4.067708
0.244847
16.094119
true
false
2024-10-07
2024-10-07
1
NotASI/FineTome-v1.5-Llama3.2-3B-1007 (Merge)
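A single record such as the one above can then be pulled out by its `fullname`; a sketch reusing `ds` from the snippet above (the fields printed are just examples drawn from the column list):

```python
# Sketch: fetch the row for one model by its repository id ("fullname"),
# reusing `ds` from the loading example above.
match = ds.filter(lambda r: r["fullname"] == "NotASI/FineTome-v1.5-Llama3.2-3B-1007")
record = match[0]

print(record["Precision"], record["Architecture"])
print(record["IFEval"], record["BBH"], record["MMLU-PRO"])
print(record["Base Model"], record["Submission Date"])
```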
NousResearch_Hermes-2-Pro-Llama-3-8B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/NousResearch/Hermes-2-Pro-Llama-3-8B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/NousResearch__Hermes-2-Pro-Llama-3-8B-details
NousResearch/Hermes-2-Pro-Llama-3-8B
bc265d1781299ed2045214289c927c207439a729
21.70492
llama3
411
8
true
false
false
true
0.749983
0.536184
53.618399
0.507113
30.667993
0.061934
6.193353
0.292785
5.704698
0.42624
11.246615
0.305186
22.798463
false
true
2024-04-30
2024-06-13
1
NousResearch/Meta-Llama-3-8B
NousResearch_Hermes-2-Pro-Mistral-7B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
https://huggingface.co/NousResearch/Hermes-2-Pro-Mistral-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/NousResearch__Hermes-2-Pro-Mistral-7B-details
NousResearch/Hermes-2-Pro-Mistral-7B
09317b1d8da639b5d9af77c06aa17cde0f0f91c0
21.702108
apache-2.0
487
7
true
false
false
true
0.472798
0.566834
56.683378
0.499544
29.427579
0.052115
5.21148
0.27349
3.131991
0.437594
14.132552
0.294631
21.625665
false
true
2024-03-11
2024-06-12
1
mistralai/Mistral-7B-v0.1
NousResearch_Hermes-2-Theta-Llama-3-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/NousResearch/Hermes-2-Theta-Llama-3-8B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/NousResearch__Hermes-2-Theta-Llama-3-8B-details
NousResearch/Hermes-2-Theta-Llama-3-8B
885173e97ab8572b444f7db1290d5d0386e26816
24.775788
apache-2.0
195
8
true
false
false
true
0.743922
0.651788
65.178837
0.520667
32.046074
0.095921
9.592145
0.303691
7.158837
0.394896
8.361979
0.336852
26.316859
false
true
2024-05-05
2024-07-11
2
NousResearch/Meta-Llama-3-8B
NousResearch_Hermes-3-Llama-3.1-70B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/NousResearch/Hermes-3-Llama-3.1-70B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/NousResearch__Hermes-3-Llama-3.1-70B-details
NousResearch/Hermes-3-Llama-3.1-70B
093242c69a91f8d9d5b8094c380b88772f9bd7f8
37.482545
llama3
96
70
true
false
false
true
11.207891
0.766144
76.614383
0.675578
53.765409
0.148036
14.803625
0.361577
14.876957
0.494896
23.428646
0.472656
41.40625
false
true
2024-07-29
2024-08-28
1
meta-llama/Meta-Llama-3.1-70B
NousResearch_Hermes-3-Llama-3.1-8B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/NousResearch/Hermes-3-Llama-3.1-8B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/NousResearch__Hermes-3-Llama-3.1-8B-details
NousResearch/Hermes-3-Llama-3.1-8B
aabb745a717e133b74dcae23195d2635cf5f38cc
23.490877
llama3
254
8
true
false
false
true
0.905808
0.617017
61.701729
0.517745
30.724097
0.047583
4.758308
0.297819
6.375839
0.436938
13.617187
0.313913
23.7681
false
true
2024-07-28
2024-08-28
1
meta-llama/Meta-Llama-3.1-8B
NousResearch_Nous-Hermes-2-Mistral-7B-DPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
https://huggingface.co/NousResearch/Nous-Hermes-2-Mistral-7B-DPO · 📑 https://huggingface.co/datasets/open-llm-leaderboard/NousResearch__Nous-Hermes-2-Mistral-7B-DPO-details
NousResearch/Nous-Hermes-2-Mistral-7B-DPO
ebec0a691037d38955727d6949798429a63929dd
21.037646
apache-2.0
169
7
true
false
false
true
0.474599
0.576251
57.625101
0.485265
27.792546
0.043807
4.380665
0.292785
5.704698
0.399979
8.330729
0.301529
22.392139
false
true
2024-02-18
2024-06-12
1
mistralai/Mistral-7B-v0.1
NousResearch_Nous-Hermes-2-Mixtral-8x7B-DPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MixtralForCausalLM
https://huggingface.co/NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO · 📑 https://huggingface.co/datasets/open-llm-leaderboard/NousResearch__Nous-Hermes-2-Mixtral-8x7B-DPO-details
NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO
286ae6737d048ad1d965c2e830864df02db50f2f
27.29025
apache-2.0
420
46
true
true
false
true
12.865144
0.58969
58.96898
0.553885
37.107784
0.11858
11.858006
0.321309
9.50783
0.459542
16.676042
0.366606
29.622858
false
true
2024-01-11
2024-07-27
1
mistralai/Mixtral-8x7B-v0.1
NousResearch_Nous-Hermes-2-Mixtral-8x7B-SFT_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MixtralForCausalLM
https://huggingface.co/NousResearch/Nous-Hermes-2-Mixtral-8x7B-SFT · 📑 https://huggingface.co/datasets/open-llm-leaderboard/NousResearch__Nous-Hermes-2-Mixtral-8x7B-SFT-details
NousResearch/Nous-Hermes-2-Mixtral-8x7B-SFT
4c06af2684730f75a6874b95e8bf6058105d9612
21.841011
apache-2.0
55
46
true
true
false
true
10.38794
0.573078
57.307832
0.505787
30.594313
0.021148
2.114804
0.302013
6.935123
0.421375
11.138542
0.306599
22.955452
false
true
2023-12-26
2024-06-12
1
mistralai/Mixtral-8x7B-v0.1
NousResearch_Nous-Hermes-2-SOLAR-10.7B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/NousResearch/Nous-Hermes-2-SOLAR-10.7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/NousResearch__Nous-Hermes-2-SOLAR-10.7B-details
NousResearch/Nous-Hermes-2-SOLAR-10.7B
14c1fbe2f71acdcd58247b30d5439bd572d52386
23.362191
apache-2.0
204
10
true
false
false
true
0.643444
0.527866
52.786606
0.541429
34.990895
0.054381
5.438066
0.293624
5.816555
0.437281
13.826823
0.345828
27.314199
false
true
2024-01-01
2024-06-12
1
upstage/SOLAR-10.7B-v1.0
NousResearch_Nous-Hermes-llama-2-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/NousResearch/Nous-Hermes-llama-2-7b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/NousResearch__Nous-Hermes-llama-2-7b-details
NousResearch/Nous-Hermes-llama-2-7b
b7c3ec54b754175e006ef75696a2ba3802697078
9.29154
mit
68
6
true
false
false
false
2.558057
0.172908
17.290788
0.382394
13.78942
0.007553
0.755287
0.263423
1.789709
0.425719
11.68151
0.193983
10.442524
false
true
2023-07-25
2024-06-12
0
NousResearch/Nous-Hermes-llama-2-7b
NousResearch_Yarn-Llama-2-13b-128k_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
https://huggingface.co/NousResearch/Yarn-Llama-2-13b-128k · 📑 https://huggingface.co/datasets/open-llm-leaderboard/NousResearch__Yarn-Llama-2-13b-128k-details
NousResearch/Yarn-Llama-2-13b-128k
4e3e87a067f64f8814c83dd5e3bad92dcf8a2391
8.418618
114
13
true
false
false
false
51.935783
0.165464
16.54643
0.382682
13.505319
0.01284
1.283988
0.258389
1.118568
0.34575
3.385417
0.232048
14.671986
false
true
2023-08-30
2024-06-13
0
NousResearch/Yarn-Llama-2-13b-128k
NousResearch_Yarn-Llama-2-7b-128k_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
https://huggingface.co/NousResearch/Yarn-Llama-2-7b-128k · 📑 https://huggingface.co/datasets/open-llm-leaderboard/NousResearch__Yarn-Llama-2-7b-128k-details
NousResearch/Yarn-Llama-2-7b-128k
e1ceedbbf2ed28b88086794441a6c05606d15437
6.701508
39
7
true
false
false
false
0.839739
0.148478
14.847826
0.324803
6.144692
0.008308
0.830816
0.260067
1.342282
0.396698
8.253906
0.179106
8.789524
false
true
2023-08-31
2024-06-13
0
NousResearch/Yarn-Llama-2-7b-128k
NousResearch_Yarn-Llama-2-7b-64k_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
https://huggingface.co/NousResearch/Yarn-Llama-2-7b-64k · 📑 https://huggingface.co/datasets/open-llm-leaderboard/NousResearch__Yarn-Llama-2-7b-64k-details
NousResearch/Yarn-Llama-2-7b-64k
08491431ac3b50add7443f5d4c02850801d877be
7.134766
23
7
true
false
false
false
0.830401
0.169986
16.998564
0.332628
7.044055
0.010574
1.057402
0.264262
1.901566
0.393875
6.934375
0.179854
8.872636
false
true
2023-08-30
2024-06-13
0
NousResearch/Yarn-Llama-2-7b-64k
NousResearch_Yarn-Mistral-7b-128k_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
MistralForCausalLM
https://huggingface.co/NousResearch/Yarn-Mistral-7b-128k · 📑 https://huggingface.co/datasets/open-llm-leaderboard/NousResearch__Yarn-Mistral-7b-128k-details
NousResearch/Yarn-Mistral-7b-128k
d09f1f8ed437d61c1aff94c1beabee554843dcdd
13.218403
apache-2.0
572
7
true
false
false
false
0.550261
0.193367
19.336693
0.431447
20.633112
0.028701
2.870091
0.298658
6.487696
0.407052
8.948177
0.289312
21.034648
false
true
2023-10-31
2024-06-12
0
NousResearch/Yarn-Mistral-7b-128k
NousResearch_Yarn-Mistral-7b-64k_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
MistralForCausalLM
https://huggingface.co/NousResearch/Yarn-Mistral-7b-64k · 📑 https://huggingface.co/datasets/open-llm-leaderboard/NousResearch__Yarn-Mistral-7b-64k-details
NousResearch/Yarn-Mistral-7b-64k
0273c624561fcecc8e8f4030492a9307aa60f945
13.502694
apache-2.0
51
7
true
false
false
false
0.54116
0.207955
20.795489
0.429319
20.2302
0.034743
3.47432
0.290268
5.369128
0.412385
9.88151
0.29139
21.265514
false
true
2023-10-31
2024-06-12
0
NousResearch/Yarn-Mistral-7b-64k
NousResearch_Yarn-Solar-10b-32k_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
https://huggingface.co/NousResearch/Yarn-Solar-10b-32k · 📑 https://huggingface.co/datasets/open-llm-leaderboard/NousResearch__Yarn-Solar-10b-32k-details
NousResearch/Yarn-Solar-10b-32k
ec3158b5276ac6644ddbdb36ccf6f9a106c98ede
15.718665
apache-2.0
10
10
true
false
false
false
1.388438
0.194815
19.481531
0.498686
28.994824
0.029456
2.945619
0.302852
7.04698
0.414646
10.597396
0.327211
25.245641
false
true
2024-01-17
2024-06-12
0
NousResearch/Yarn-Solar-10b-32k
NousResearch_Yarn-Solar-10b-64k_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
https://huggingface.co/NousResearch/Yarn-Solar-10b-64k · 📑 https://huggingface.co/datasets/open-llm-leaderboard/NousResearch__Yarn-Solar-10b-64k-details
NousResearch/Yarn-Solar-10b-64k
703818628a5e8ef637e48e8dbeb3662aa0497aff
15.124286
apache-2.0
15
10
true
false
false
false
0.763753
0.198887
19.888673
0.492199
28.395714
0.026435
2.643505
0.302013
6.935123
0.401438
9.013021
0.314827
23.869681
false
true
2024-01-17
2024-06-12
0
NousResearch/Yarn-Solar-10b-64k
NucleusAI_nucleus-22B-token-500B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
https://huggingface.co/NucleusAI/nucleus-22B-token-500B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/NucleusAI__nucleus-22B-token-500B-details
NucleusAI/nucleus-22B-token-500B
49bb1a47c0d32b4bfa6630a4eff04a857adcd4ca
1.633416
mit
25
21
true
false
false
false
0.594818
0.025654
2.565415
0.29198
1.887999
0
0
0.25
0
0.351052
3.548177
0.11619
1.798907
false
false
2023-10-06
2024-06-26
0
NucleusAI/nucleus-22B-token-500B
OEvortex_HelpingAI-15B_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
https://huggingface.co/OEvortex/HelpingAI-15B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/OEvortex__HelpingAI-15B-details
OEvortex/HelpingAI-15B
fcc5d4eeee08c07680a2560a302de3eaa5d6f550
4.515496
other
12
15
true
false
false
true
1.227237
0.203009
20.300913
0.293601
1.815381
0
0
0.25755
1.006711
0.361875
2.734375
0.11112
1.235594
false
false
2024-07-11
2024-07-13
0
OEvortex/HelpingAI-15B
OEvortex_HelpingAI-3B-reloaded_float16
float16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
https://huggingface.co/OEvortex/HelpingAI-3B-reloaded · 📑 https://huggingface.co/datasets/open-llm-leaderboard/OEvortex__HelpingAI-3B-reloaded-details
OEvortex/HelpingAI-3B-reloaded
aaee653fea06ba322e7a9ed15530db605cc3b382
14.592187
other
1
2
true
false
false
true
0.561626
0.464668
46.466819
0.412851
16.98574
0.003021
0.302115
0.263423
1.789709
0.352448
4.289323
0.259475
17.719415
false
false
2024-10-31
2024-10-31
0
OEvortex/HelpingAI-3B-reloaded
OEvortex_HelpingAI2-9B_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
https://huggingface.co/OEvortex/HelpingAI2-9B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/OEvortex__HelpingAI2-9B-details
OEvortex/HelpingAI2-9B
b45a18cf41d0d438d71d79687e098ec60dd0aec1
17.418105
other
23
8
true
false
false
true
1.040651
0.441312
44.131238
0.484462
27.073242
0.047583
4.758308
0.258389
1.118568
0.371083
6.31875
0.289977
21.108525
false
false
2024-08-16
2024-10-11
0
OEvortex/HelpingAI2-9B
OEvortex_HelpingAI2.5-10B_float16
float16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
https://huggingface.co/OEvortex/HelpingAI2.5-10B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/OEvortex__HelpingAI2.5-10B-details
OEvortex/HelpingAI2.5-10B
25ac750b886c7e42521c769e6c2cd2b1143cfbcc
13.371895
other
2
10
true
false
false
true
0.939393
0.327656
32.765617
0.449566
21.135366
0
0
0.269295
2.572707
0.373813
6.259896
0.25748
17.497784
false
false
2024-11-17
2024-11-19
0
OEvortex/HelpingAI2.5-10B
OliveiraJLT_Sagui-7B-Instruct-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/OliveiraJLT/Sagui-7B-Instruct-v0.1 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/OliveiraJLT__Sagui-7B-Instruct-v0.1-details
OliveiraJLT/Sagui-7B-Instruct-v0.1
e3032ba89a6df12b801ab3be2a29b59068aa048d
8.390586
0
6
false
false
false
true
1.070938
0.289163
28.916275
0.311068
5.043572
0.003776
0.377644
0.24245
0
0.419052
10.614844
0.148521
5.391179
false
false
2024-07-18
0
Removed
Omkar1102_code-yi_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/Omkar1102/code-yi · 📑 https://huggingface.co/datasets/open-llm-leaderboard/Omkar1102__code-yi-details
Omkar1102/code-yi
7e875c1d64029d1f8db6813bd2b715cb5406b745
4.921767
0
2
false
false
false
false
0.44252
0.214775
21.477458
0.276006
1.844158
0
0
0.250839
0.111857
0.380229
4.695313
0.112616
1.401817
false
false
2024-11-16
0
Removed
Omkar1102_code-yi_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/Omkar1102/code-yi · 📑 https://huggingface.co/datasets/open-llm-leaderboard/Omkar1102__code-yi-details
Omkar1102/code-yi
7e875c1d64029d1f8db6813bd2b715cb5406b745
5.1703
0
2
false
false
false
false
0.851172
0.225441
22.544072
0.275003
1.581399
0
0
0.25755
1.006711
0.376198
4.52474
0.112284
1.364879
false
false
2024-11-16
0
Removed
OmnicromsBrain_NeuralStar_FusionWriter_4x7b_float16
float16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
https://huggingface.co/OmnicromsBrain/NeuralStar_FusionWriter_4x7b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/OmnicromsBrain__NeuralStar_FusionWriter_4x7b-details
OmnicromsBrain/NeuralStar_FusionWriter_4x7b
fbe296d2c76acbb792cdd22e14d1c8bb13723839
20.071645
apache-2.0
5
24
true
true
false
true
1.376465
0.596384
59.638426
0.477624
26.03844
0.049094
4.909366
0.278523
3.803132
0.401875
8.201042
0.260555
17.839465
true
false
2024-06-07
2024-07-01
1
OmnicromsBrain/NeuralStar_FusionWriter_4x7b (Merge)
Open-Orca_Mistral-7B-OpenOrca_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca · 📑 https://huggingface.co/datasets/open-llm-leaderboard/Open-Orca__Mistral-7B-OpenOrca-details
Open-Orca/Mistral-7B-OpenOrca
4a37328cef00f524d3791b1c0cc559a3cc6af14d
17.696475
apache-2.0
674
7
true
false
false
true
0.53358
0.497766
49.776593
0.476817
25.840025
0.033988
3.398792
0.271812
2.908277
0.385781
5.889323
0.265293
18.365839
false
true
2023-09-29
2024-06-12
0
Open-Orca/Mistral-7B-OpenOrca
OpenAssistant_oasst-sft-1-pythia-12b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
https://huggingface.co/OpenAssistant/oasst-sft-1-pythia-12b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/OpenAssistant__oasst-sft-1-pythia-12b-details
OpenAssistant/oasst-sft-1-pythia-12b
293df535fe7711a5726987fc2f17dfc87de452a1
3.669242
apache-2.0
278
12
true
false
false
false
0.888057
0.105539
10.553886
0.314663
4.778509
0.01435
1.435045
0.25755
1.006711
0.332698
2.98724
0.111287
1.254063
false
true
2023-03-09
2024-06-12
0
OpenAssistant/oasst-sft-1-pythia-12b
OpenBuddy_openbuddy-llama3-70b-v21.2-32k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/OpenBuddy/openbuddy-llama3-70b-v21.2-32k · 📑 https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-llama3-70b-v21.2-32k-details
OpenBuddy/openbuddy-llama3-70b-v21.2-32k
e79a2f16c052fc76eeafb5b51d16261b2b981d0f
35.465341
other
1
70
true
false
false
true
13.035936
0.701048
70.104766
0.650744
49.969366
0.197885
19.78852
0.342282
12.304251
0.457969
18.046094
0.483211
42.579048
false
false
2024-06-12
2024-09-05
0
OpenBuddy/openbuddy-llama3-70b-v21.2-32k
OpenBuddy_openbuddy-llama3-8b-v21.1-8k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/OpenBuddy/openbuddy-llama3-8b-v21.1-8k · 📑 https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-llama3-8b-v21.1-8k-details
OpenBuddy/openbuddy-llama3-8b-v21.1-8k
658508bce03ccd61cea9657e0357bd4cd10503ba
19.961528
other
30
8
true
false
false
true
0.819422
0.556967
55.696663
0.47875
26.115045
0.030967
3.096677
0.270973
2.796421
0.398771
10.346354
0.295462
21.718011
false
false
2024-04-20
2024-08-03
0
OpenBuddy/openbuddy-llama3-8b-v21.1-8k
OpenBuddy_openbuddy-llama3-8b-v21.2-32k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/OpenBuddy/openbuddy-llama3-8b-v21.2-32k · 📑 https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-llama3-8b-v21.2-32k-details
OpenBuddy/openbuddy-llama3-8b-v21.2-32k
f3ea2dec2533a3dd97df32db2376b17875cafda2
21.905834
other
0
8
true
false
false
true
0.849889
0.61919
61.919041
0.485622
27.252335
0.068731
6.873112
0.279362
3.914989
0.377875
5.934375
0.32987
25.54115
false
false
2024-06-18
2024-06-26
0
OpenBuddy/openbuddy-llama3-8b-v21.2-32k
OpenBuddy_openbuddy-llama3.1-70b-v22.1-131k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/OpenBuddy/openbuddy-llama3.1-70b-v22.1-131k · 📑 https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-llama3.1-70b-v22.1-131k-details
OpenBuddy/openbuddy-llama3.1-70b-v22.1-131k
43ed945180174d79a8f6c68509161c249c884dfa
35.307882
other
1
70
true
false
false
true
12.191623
0.733271
73.327105
0.669849
51.940776
0.03852
3.851964
0.375
16.666667
0.462958
18.236458
0.530419
47.82432
false
false
2024-08-21
2024-08-24
0
OpenBuddy/openbuddy-llama3.1-70b-v22.1-131k
OpenBuddy_openbuddy-llama3.1-8b-v22.2-131k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/OpenBuddy/openbuddy-llama3.1-8b-v22.2-131k · 📑 https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-llama3.1-8b-v22.2-131k-details
OpenBuddy/openbuddy-llama3.1-8b-v22.2-131k
0d9d85c7a5e4292e07c346147de56bd3991d525c
24.254528
other
2
8
true
false
false
true
0.791063
0.665727
66.572694
0.500652
29.057538
0.104985
10.498489
0.279362
3.914989
0.408104
9.813021
0.331034
25.670434
false
false
2024-07-28
2024-07-29
0
OpenBuddy/openbuddy-llama3.1-8b-v22.2-131k
OpenBuddy_openbuddy-llama3.1-8b-v22.3-131k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/OpenBuddy/openbuddy-llama3.1-8b-v22.3-131k · 📑 https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-llama3.1-8b-v22.3-131k-details
OpenBuddy/openbuddy-llama3.1-8b-v22.3-131k
0097358fa1a450251b7ea1a03a5effdfded6c461
23.07878
other
2
8
true
false
false
true
0.834223
0.599707
59.970656
0.506591
30.319511
0.106495
10.649547
0.279362
3.914989
0.401469
8.316927
0.327709
25.301049
false
false
2024-08-16
2024-08-24
0
OpenBuddy/openbuddy-llama3.1-8b-v22.3-131k
OpenBuddy_openbuddy-llama3.2-1b-v23.1-131k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/OpenBuddy/openbuddy-llama3.2-1b-v23.1-131k · 📑 https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-llama3.2-1b-v23.1-131k-details
OpenBuddy/openbuddy-llama3.2-1b-v23.1-131k
71b61e0e02e55553902f0051074d2ae965413cdb
8.959802
llama3.2
3
1
true
false
false
true
0.446143
0.359005
35.900522
0.326656
6.04362
0.001511
0.151057
0.258389
1.118568
0.334219
1.210677
0.184009
9.334368
false
false
2024-10-07
2024-10-09
0
OpenBuddy/openbuddy-llama3.2-1b-v23.1-131k
OpenBuddy_openbuddy-llama3.2-3b-v23.2-131k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/OpenBuddy/openbuddy-llama3.2-3b-v23.2-131k · 📑 https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-llama3.2-3b-v23.2-131k-details
OpenBuddy/openbuddy-llama3.2-3b-v23.2-131k
7cd2baa3d9bb99e970d711fb7afe786753bc25ea
13.394834
llama3.2
0
3
true
false
false
true
0.695572
0.431945
43.194502
0.407266
16.588826
0.002266
0.226586
0.276007
3.467562
0.326313
0.455729
0.247922
16.435801
false
false
2024-10-14
2024-10-15
0
OpenBuddy/openbuddy-llama3.2-3b-v23.2-131k
OpenBuddy_openbuddy-mixtral-7bx8-v18.1-32k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MixtralForCausalLM
https://huggingface.co/OpenBuddy/openbuddy-mixtral-7bx8-v18.1-32k · 📑 https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-mixtral-7bx8-v18.1-32k-details
OpenBuddy/openbuddy-mixtral-7bx8-v18.1-32k
98596b6731058cc9cca85f3b8ac9077342cb60ae
22.229104
apache-2.0
14
46
true
true
false
true
4.868876
0.549348
54.934795
0.465618
24.535443
0.101964
10.196375
0.30453
7.270694
0.383052
5.28151
0.380402
31.155807
false
false
2024-02-12
2024-06-26
0
OpenBuddy/openbuddy-mixtral-7bx8-v18.1-32k
OpenBuddy_openbuddy-nemotron-70b-v23.1-131k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/OpenBuddy/openbuddy-nemotron-70b-v23.1-131k · 📑 https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-nemotron-70b-v23.1-131k-details
OpenBuddy/openbuddy-nemotron-70b-v23.1-131k
d8cb98fb9281a84eb0df8216bae60beaf5181921
39.080117
llama3.1
3
70
true
false
false
true
24.294955
0.755528
75.552756
0.674947
53.188049
0.278701
27.870091
0.363255
15.100671
0.45375
16.385417
0.517453
46.383717
false
false
2024-10-20
2024-10-23
3
meta-llama/Meta-Llama-3.1-70B
OpenBuddy_openbuddy-nemotron-70b-v23.2-131k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/OpenBuddy/openbuddy-nemotron-70b-v23.2-131k · 📑 https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-nemotron-70b-v23.2-131k-details
OpenBuddy/openbuddy-nemotron-70b-v23.2-131k
7a39fd93b078189c6892344c2f01059320543e2f
38.516599
llama3.1
1
70
true
false
false
true
12.394908
0.722655
72.265478
0.670481
52.265662
0.272659
27.265861
0.359899
14.653244
0.469594
18.865885
0.512051
45.783466
false
false
2024-10-24
2024-10-24
3
meta-llama/Meta-Llama-3.1-70B
OpenBuddy_openbuddy-qwen2.5llamaify-14b-v23.1-200k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/OpenBuddy/openbuddy-qwen2.5llamaify-14b-v23.1-200k · 📑 https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-qwen2.5llamaify-14b-v23.1-200k-details
OpenBuddy/openbuddy-qwen2.5llamaify-14b-v23.1-200k
001e14063e2702a9b2284dc6ec889d2586dc839b
31.042901
apache-2.0
0
14
true
false
false
true
1.45675
0.630881
63.088051
0.60132
43.276499
0.164653
16.465257
0.333054
11.073826
0.424042
11.538542
0.467337
40.815233
false
false
2024-09-23
2024-09-23
0
OpenBuddy/openbuddy-qwen2.5llamaify-14b-v23.1-200k
OpenBuddy_openbuddy-qwen2.5llamaify-14b-v23.3-200k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/OpenBuddy/openbuddy-qwen2.5llamaify-14b-v23.3-200k · 📑 https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-qwen2.5llamaify-14b-v23.3-200k-details
OpenBuddy/openbuddy-qwen2.5llamaify-14b-v23.3-200k
0cef6f7719c1eb3bc1ebba133508c2c6d67e635c
28.848771
apache-2.0
4
14
true
false
false
true
1.497949
0.613145
61.314534
0.608086
44.18394
0.024169
2.416918
0.327181
10.290828
0.434583
12.722917
0.479471
42.16349
false
false
2024-10-02
2024-10-11
0
OpenBuddy/openbuddy-qwen2.5llamaify-14b-v23.3-200k
OpenBuddy_openbuddy-qwen2.5llamaify-7b-v23.1-200k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/OpenBuddy/openbuddy-qwen2.5llamaify-7b-v23.1-200k · 📑 https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-qwen2.5llamaify-7b-v23.1-200k-details
OpenBuddy/openbuddy-qwen2.5llamaify-7b-v23.1-200k
91521abfec2a00f4853f6cb4dd620177617ca572
26.843617
apache-2.0
0
7
true
false
false
true
1.843131
0.567258
56.725821
0.550938
36.398128
0.127644
12.76435
0.314597
8.612975
0.436323
13.807031
0.394781
32.753398
false
false
2024-10-04
2024-10-10
2
Qwen/Qwen2.5-7B
OpenBuddy_openbuddy-yi1.5-34b-v21.3-32k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/OpenBuddy/openbuddy-yi1.5-34b-v21.3-32k · 📑 https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-yi1.5-34b-v21.3-32k-details
OpenBuddy/openbuddy-yi1.5-34b-v21.3-32k
966be6ad502cdd50a9af94d5f003aec040cdb0b5
30.295298
apache-2.0
0
34
true
true
false
true
3.038327
0.542004
54.20041
0.616257
45.637093
0.140483
14.048338
0.348993
13.199105
0.443948
14.69349
0.45994
39.993351
false
false
2024-06-05
2024-08-30
0
OpenBuddy/openbuddy-yi1.5-34b-v21.3-32k
OpenBuddy_openbuddy-zero-14b-v22.3-32k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/OpenBuddy/openbuddy-zero-14b-v22.3-32k · 📑 https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-zero-14b-v22.3-32k-details
OpenBuddy/openbuddy-zero-14b-v22.3-32k
d9a0b6bc02f283e154c9ad6db43a5a97eed97f5b
19.267602
other
1
14
true
false
false
true
1.688769
0.375292
37.5292
0.485976
26.289507
0.085347
8.534743
0.307047
7.606264
0.416604
11.342188
0.318733
24.303709
false
false
2024-07-16
2024-07-29
0
OpenBuddy/openbuddy-zero-14b-v22.3-32k
OpenBuddy_openbuddy-zero-3b-v21.2-32k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/OpenBuddy/openbuddy-zero-3b-v21.2-32k · 📑 https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-zero-3b-v21.2-32k-details
OpenBuddy/openbuddy-zero-3b-v21.2-32k
74e1d168c5e917219d668d1483f6355dd0464a31
11.549657
other
2
4
true
false
false
true
0.878619
0.380238
38.023777
0.393479
15.293406
0.009063
0.906344
0.260067
1.342282
0.356635
2.246094
0.203374
11.486037
false
false
2024-06-02
2024-06-26
0
OpenBuddy/openbuddy-zero-3b-v21.2-32k
OpenBuddy_openbuddy-zero-56b-v21.2-32k_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/OpenBuddy/openbuddy-zero-56b-v21.2-32k · 📑 https://huggingface.co/datasets/open-llm-leaderboard/OpenBuddy__openbuddy-zero-56b-v21.2-32k-details
OpenBuddy/openbuddy-zero-56b-v21.2-32k
c7a1a4a6e798f75d1d3219ab9ff9f2692e29f7d5
28.233905
other
0
56
true
false
false
true
7.573746
0.505709
50.57093
0.612835
44.796542
0.14426
14.425982
0.317953
9.060403
0.430521
12.781771
0.43991
37.767804
false
false
2024-06-10
2024-06-26
0
OpenBuddy/openbuddy-zero-56b-v21.2-32k
OpenLeecher_llama3-8b-lima_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/OpenLeecher/llama3-8b-lima · 📑 https://huggingface.co/datasets/open-llm-leaderboard/OpenLeecher__llama3-8b-lima-details
OpenLeecher/llama3-8b-lima
237a2bcb240eecd9355a091f839e42ba3d31bda5
14.761082
0
8
false
false
false
true
0.958929
0.437066
43.706587
0.429583
19.573065
0.034743
3.47432
0.238255
0
0.371271
3.742187
0.262633
18.070331
false
false
2024-10-01
0
Removed
OpenScholar_Llama-3.1_OpenScholar-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/OpenScholar/Llama-3.1_OpenScholar-8B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/OpenScholar__Llama-3.1_OpenScholar-8B-details
OpenScholar/Llama-3.1_OpenScholar-8B
e26aeb22af568bd8d01ffde86ebbd13c3cf4fcc5
25.583689
apache-2.0
52
8
true
false
false
true
0.632607
0.606401
60.640102
0.520774
32.403921
0.142749
14.274924
0.281879
4.250559
0.42751
11.838802
0.370844
30.093824
false
false
2024-11-15
2024-12-03
1
OpenScholar/Llama-3.1_OpenScholar-8B (Merge)
Orenguteng_Llama-3.1-8B-Lexi-Uncensored_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/Orenguteng/Llama-3.1-8B-Lexi-Uncensored · 📑 https://huggingface.co/datasets/open-llm-leaderboard/Orenguteng__Llama-3.1-8B-Lexi-Uncensored-details
Orenguteng/Llama-3.1-8B-Lexi-Uncensored
56ac439ab4c7826871493ffbe2d49f2100a98e97
26.860413
llama3.1
41
8
true
false
false
true
0.856735
0.777684
77.768432
0.505726
29.242543
0.138218
13.821752
0.271812
2.908277
0.387115
6.422656
0.378989
30.998818
false
false
2024-07-26
2024-07-29
0
Orenguteng/Llama-3.1-8B-Lexi-Uncensored
Orenguteng_Llama-3.1-8B-Lexi-Uncensored-V2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/Orenguteng/Llama-3.1-8B-Lexi-Uncensored-V2 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/Orenguteng__Llama-3.1-8B-Lexi-Uncensored-V2-details
Orenguteng/Llama-3.1-8B-Lexi-Uncensored-V2
2340f8fbcd2452125a798686ca90b882a08fb0d9
27.925121
llama3.1
115
8
true
false
false
true
0.869686
0.779158
77.915819
0.508401
29.687033
0.169184
16.918429
0.282718
4.362416
0.384292
7.769792
0.378075
30.897237
false
false
2024-08-09
2024-08-28
0
Orenguteng/Llama-3.1-8B-Lexi-Uncensored-V2
Orion-zhen_Qwen2.5-7B-Instruct-Uncensored_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
https://huggingface.co/Orion-zhen/Qwen2.5-7B-Instruct-Uncensored · 📑 https://huggingface.co/datasets/open-llm-leaderboard/Orion-zhen__Qwen2.5-7B-Instruct-Uncensored-details
Orion-zhen/Qwen2.5-7B-Instruct-Uncensored
33c24657b4394fc430ad90b5d413e5985ce8e292
27.989712
gpl-3.0
12
7
true
false
false
true
1.116812
0.720432
72.043179
0.547392
35.832453
0.013595
1.359517
0.302852
7.04698
0.436135
13.583594
0.442653
38.072547
false
false
2024-09-26
2024-10-19
1
Orion-zhen/Qwen2.5-7B-Instruct-Uncensored (Merge)
P0x0_Astra-v1-12B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/P0x0/Astra-v1-12B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/P0x0__Astra-v1-12B-details
P0x0/Astra-v1-12B
c706e253f8d8fa838b505cbec0e1a6aeec545abc
19.6743
apache-2.0
2
12
true
false
false
false
1.605673
0.280594
28.059438
0.521451
31.809907
0.109517
10.951662
0.313758
8.501119
0.405188
11.381771
0.346077
27.341903
false
false
2024-09-21
2024-09-23
1
mistralai/Mistral-Nemo-Base-2407
PJMixers_LLaMa-3-CursedStock-v2.0-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/PJMixers/LLaMa-3-CursedStock-v2.0-8B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/PJMixers__LLaMa-3-CursedStock-v2.0-8B-details
PJMixers/LLaMa-3-CursedStock-v2.0-8B
d47cc29df363f71ffaf6cd21ac4bdeefa27359db
24.203898
llama3
10
8
true
false
false
true
1.402692
0.633079
63.307912
0.527116
32.563612
0.096677
9.667674
0.274329
3.243848
0.385625
8.036458
0.355635
28.403886
true
false
2024-06-26
2024-06-27
1
PJMixers/LLaMa-3-CursedStock-v2.0-8B (Merge)
PJMixers-Dev_LLaMa-3.2-Instruct-JankMix-v0.1-SFT-3B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.1-SFT-3B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/PJMixers-Dev__LLaMa-3.2-Instruct-JankMix-v0.1-SFT-3B-details
PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.1-SFT-3B
1286f51489b06fe67fa36d57aa87331fa37e698b
22.626211
llama3.2
0
3
true
false
false
true
0.713694
0.693054
69.305443
0.455617
23.808307
0.117069
11.706949
0.274329
3.243848
0.370031
4.053906
0.312749
23.638815
false
false
2024-10-12
2024-10-12
1
PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.1-SFT-3B (Merge)
PJMixers-Dev_LLaMa-3.2-Instruct-JankMix-v0.2-SFT-3B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.2-SFT-3B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/PJMixers-Dev__LLaMa-3.2-Instruct-JankMix-v0.2-SFT-3B-details
PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.2-SFT-3B
4c348a8dfc1be0b4985e0ed2882329515a60c19d
21.671969
llama3.2
0
3
true
false
false
true
0.709899
0.629157
62.91573
0.45815
23.34124
0.123867
12.386707
0.272651
3.020134
0.365875
4.867708
0.311503
23.500296
false
false
2024-10-14
2024-10-14
1
PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.2-SFT-3B (Merge)
PJMixers-Dev_LLaMa-3.2-Instruct-JankMix-v0.2-SFT-HailMary-v0.1-KTO-3B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.2-SFT-HailMary-v0.1-KTO-3B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/PJMixers-Dev__LLaMa-3.2-Instruct-JankMix-v0.2-SFT-HailMary-v0.1-KTO-3B-details
PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.2-SFT-HailMary-v0.1-KTO-3B
17b245cfcffcc6aadc90989bf08d9625455064e1
21.687798
llama3.2
0
3
true
false
false
true
0.679042
0.65039
65.038985
0.451079
22.288715
0.117825
11.782477
0.271812
2.908277
0.368729
4.691146
0.310755
23.417184
false
false
2024-10-28
2024-10-28
1
PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.2-SFT-HailMary-v0.1-KTO-3B (Merge)
PJMixers-Dev_LLaMa-3.2-Instruct-JankMixBread-v0.1-3B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/PJMixers-Dev/LLaMa-3.2-Instruct-JankMixBread-v0.1-3B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/PJMixers-Dev__LLaMa-3.2-Instruct-JankMixBread-v0.1-3B-details
PJMixers-Dev/LLaMa-3.2-Instruct-JankMixBread-v0.1-3B
19faf7463cab41a2492cad26fc54b2fce3a05caf
19.57365
llama3.2
0
3
true
false
false
true
0.702428
0.504086
50.408583
0.448316
22.759588
0.120846
12.084592
0.282718
4.362416
0.351552
4.677344
0.308344
23.149379
true
false
2024-10-12
2024-10-12
1
PJMixers-Dev/LLaMa-3.2-Instruct-JankMixBread-v0.1-3B (Merge)
Parissa3_test-model_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/Parissa3/test-model · 📑 https://huggingface.co/datasets/open-llm-leaderboard/Parissa3__test-model-details
Parissa3/test-model
7021138dac98d930f1ce0ebe186583c0813d6f48
20.758506
0
7
false
false
false
false
0.473353
0.388256
38.825649
0.519392
32.839032
0.06571
6.570997
0.294463
5.928412
0.468531
17.533073
0.305685
22.853871
false
false
2024-11-16
2024-11-16
1
Parissa3/test-model (Merge)
PocketDoc_Dans-Instruct-CoreCurriculum-12b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/PocketDoc/Dans-Instruct-CoreCurriculum-12b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/PocketDoc__Dans-Instruct-CoreCurriculum-12b-details
PocketDoc/Dans-Instruct-CoreCurriculum-12b
c50db5ba880b7edc0efd32a7f3b9d2f051c3f4a6
9.402824
0
12
false
false
false
true
1.588538
0.219145
21.91452
0.378874
13.232565
0.049094
4.909366
0.282718
4.362416
0.409563
9.561979
0.121925
2.436096
false
false
2024-09-01
0
Removed
PocketDoc_Dans-PersonalityEngine-v1.0.0-8b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/PocketDoc/Dans-PersonalityEngine-v1.0.0-8b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/PocketDoc__Dans-PersonalityEngine-v1.0.0-8b-details
PocketDoc/Dans-PersonalityEngine-v1.0.0-8b
c64612e1eee1ddb3aa064a25eba8921ec3d94325
18.77942
apache-2.0
4
8
true
false
false
true
0.900974
0.49819
49.819036
0.473255
25.68796
0.055891
5.589124
0.285235
4.697987
0.354156
3.936198
0.306516
22.946217
false
false
2024-10-08
2024-10-08
1
PocketDoc/Dans-PersonalityEngine-v1.0.0-8b (Merge)
PranavHarshan_LaMistral-V4_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/PranavHarshan/LaMistral-V4 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/PranavHarshan__LaMistral-V4-details
PranavHarshan/LaMistral-V4
b373c2a1ab08823b6b119899f807793c96ef7888
24.210765
apache-2.0
1
8
true
false
false
true
0.697232
0.623861
62.386135
0.518426
31.091349
0.068731
6.873112
0.32802
10.402685
0.364292
5.636458
0.359874
28.874852
true
false
2024-10-01
2024-10-05
1
PranavHarshan/LaMistral-V4 (Merge)
PranavHarshan_MedNarra-X1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/PranavHarshan/MedNarra-X1 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/PranavHarshan__MedNarra-X1-details
PranavHarshan/MedNarra-X1
9fe294e7fd69ec56f0b7fa1a23759eed070f44bf
18.128682
0
8
false
false
false
false
0.676161
0.433843
43.384331
0.463717
23.523495
0.046828
4.682779
0.307886
7.718121
0.354031
2.453906
0.343085
27.009456
false
false
2024-10-08
2024-10-09
1
PranavHarshan/MedNarra-X1 (Merge)
Pretergeek_OpenChat-3.5-0106_10.7B_48Layers-Appended_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/Pretergeek/OpenChat-3.5-0106_10.7B_48Layers-Appended · 📑 https://huggingface.co/datasets/open-llm-leaderboard/Pretergeek__OpenChat-3.5-0106_10.7B_48Layers-Appended-details
Pretergeek/OpenChat-3.5-0106_10.7B_48Layers-Appended
1091b30480f4cc91f26cb1bd7579e527f490f8d2
22.7107
apache-2.0
2
10
true
false
false
true
0.8361
0.59606
59.605957
0.461964
24.057173
0.077795
7.779456
0.307047
7.606264
0.425406
11.775781
0.328956
25.439569
true
false
2024-07-27
2024-07-31
1
Pretergeek/OpenChat-3.5-0106_10.7B_48Layers-Appended (Merge)
Pretergeek_OpenChat-3.5-0106_10.7B_48Layers-Interleaved_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Pretergeek/OpenChat-3.5-0106_10.7B_48Layers-Interleaved" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Pretergeek/OpenChat-3.5-0106_10.7B_48Layers-Interleaved</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Pretergeek__OpenChat-3.5-0106_10.7B_48Layers-Interleaved-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Pretergeek/OpenChat-3.5-0106_10.7B_48Layers-Interleaved
dd6bd9a8a9a2223a02a4e8aa6270accbc8d4d81a
22.659113
apache-2.0
2
10
true
false
false
true
0.836592
0.59606
59.605957
0.461964
24.057173
0.077039
7.703927
0.30453
7.270694
0.425406
11.775781
0.32987
25.54115
true
false
2024-08-10
2024-08-16
1
Pretergeek/OpenChat-3.5-0106_10.7B_48Layers-Interleaved (Merge)
Pretergeek_OpenChat-3.5-0106_32K-PoSE_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Pretergeek/OpenChat-3.5-0106_32K-PoSE" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Pretergeek/OpenChat-3.5-0106_32K-PoSE</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Pretergeek__OpenChat-3.5-0106_32K-PoSE-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Pretergeek/OpenChat-3.5-0106_32K-PoSE
da6a73abac7fba68f1df4d42485d79553e97bf91
12.70227
apache-2.0
4
7
true
false
false
true
0.460399
0.396899
39.689912
0.347131
8.828395
0.01435
1.435045
0.276007
3.467562
0.420542
11.334375
0.203125
11.458333
false
false
2024-11-02
2024-11-02
1
Pretergeek/OpenChat-3.5-0106_32K-PoSE (Merge)
Pretergeek_OpenChat-3.5-0106_8.11B_36Layers-Appended_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Pretergeek/OpenChat-3.5-0106_8.11B_36Layers-Appended" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Pretergeek/OpenChat-3.5-0106_8.11B_36Layers-Appended</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Pretergeek__OpenChat-3.5-0106_8.11B_36Layers-Appended-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Pretergeek/OpenChat-3.5-0106_8.11B_36Layers-Appended
e957847e013bdd2f6e852b8a1c369ddce92fca78
22.736095
apache-2.0
2
8
true
false
false
true
0.683707
0.597583
59.75833
0.461964
24.057173
0.077795
7.779456
0.307047
7.606264
0.425406
11.775781
0.328956
25.439569
true
false
2024-07-26
2024-07-27
1
Pretergeek/OpenChat-3.5-0106_8.11B_36Layers-Appended (Merge)
Pretergeek_OpenChat-3.5-0106_8.11B_36Layers-Interleaved_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Pretergeek/OpenChat-3.5-0106_8.11B_36Layers-Interleaved" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Pretergeek/OpenChat-3.5-0106_8.11B_36Layers-Interleaved</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Pretergeek__OpenChat-3.5-0106_8.11B_36Layers-Interleaved-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Pretergeek/OpenChat-3.5-0106_8.11B_36Layers-Interleaved
485ebe835c6c001af0a1a6e0e40aab27bc195842
22.617725
apache-2.0
2
8
true
false
false
true
0.660455
0.59606
59.605957
0.46213
24.075506
0.077039
7.703927
0.30453
7.270694
0.424073
11.509115
0.32987
25.54115
true
false
2024-08-10
2024-08-16
1
Pretergeek/OpenChat-3.5-0106_8.11B_36Layers-Interleaved (Merge)
Pretergeek_OpenChat-3.5-0106_8.99B_40Layers-Appended_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Pretergeek/OpenChat-3.5-0106_8.99B_40Layers-Appended" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Pretergeek/OpenChat-3.5-0106_8.99B_40Layers-Appended</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Pretergeek__OpenChat-3.5-0106_8.99B_40Layers-Appended-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Pretergeek/OpenChat-3.5-0106_8.99B_40Layers-Appended
2120720b7fb2ecc27b9c03cc876316fd25b26e40
22.7107
apache-2.0
2
8
true
false
false
true
0.719249
0.59606
59.605957
0.461964
24.057173
0.077795
7.779456
0.307047
7.606264
0.425406
11.775781
0.328956
25.439569
true
false
2024-07-26
2024-07-27
1
Pretergeek/OpenChat-3.5-0106_8.99B_40Layers-Appended (Merge)
Pretergeek_OpenChat-3.5-0106_8.99B_40Layers-Interleaved_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Pretergeek/OpenChat-3.5-0106_8.99B_40Layers-Interleaved" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Pretergeek/OpenChat-3.5-0106_8.99B_40Layers-Interleaved</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Pretergeek__OpenChat-3.5-0106_8.99B_40Layers-Interleaved-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Pretergeek/OpenChat-3.5-0106_8.99B_40Layers-Interleaved
b6dfa36a99179674706d5e859714afa6b8743640
22.64312
apache-2.0
2
8
true
false
false
true
0.727811
0.597583
59.75833
0.46213
24.075506
0.077039
7.703927
0.30453
7.270694
0.424073
11.509115
0.32987
25.54115
true
false
2024-08-10
2024-08-16
1
Pretergeek/OpenChat-3.5-0106_8.99B_40Layers-Interleaved (Merge)
Pretergeek_OpenChat-3.5-0106_9.86B_44Layers-Appended_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Pretergeek/OpenChat-3.5-0106_9.86B_44Layers-Appended" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Pretergeek/OpenChat-3.5-0106_9.86B_44Layers-Appended</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Pretergeek__OpenChat-3.5-0106_9.86B_44Layers-Appended-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Pretergeek/OpenChat-3.5-0106_9.86B_44Layers-Appended
8a7ef4a2c4faf8760650e26e44509920bace633a
22.7107
apache-2.0
2
9
true
false
false
true
0.786763
0.59606
59.605957
0.461964
24.057173
0.077795
7.779456
0.307047
7.606264
0.425406
11.775781
0.328956
25.439569
true
false
2024-07-27
2024-07-27
1
Pretergeek/OpenChat-3.5-0106_9.86B_44Layers-Appended (Merge)
Pretergeek_openchat-3.5-0106_Rebased_Mistral-7B-v0.2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Pretergeek/openchat-3.5-0106_Rebased_Mistral-7B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Pretergeek/openchat-3.5-0106_Rebased_Mistral-7B-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Pretergeek__openchat-3.5-0106_Rebased_Mistral-7B-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Pretergeek/openchat-3.5-0106_Rebased_Mistral-7B-v0.2
31c11027a7320115af1e5c33b41bcace83420fe2
16.0271
apache-2.0
2
7
true
false
false
true
0.621838
0.370621
37.062106
0.362711
10.910768
0.043807
4.380665
0.271812
2.908277
0.48401
20.567969
0.282995
20.332816
false
false
2024-07-21
2024-07-21
0
Pretergeek/openchat-3.5-0106_Rebased_Mistral-7B-v0.2
PrimeIntellect_INTELLECT-1_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/PrimeIntellect/INTELLECT-1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">PrimeIntellect/INTELLECT-1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/PrimeIntellect__INTELLECT-1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
PrimeIntellect/INTELLECT-1
3b8d48b5ce11ee9526495f1db9eb1644518bfce0
3.806302
apache-2.0
55
10
true
false
false
false
0.995318
0.175732
17.57315
0.27598
1.0435
0
0
0.253356
0.447427
0.333938
2.408854
0.112284
1.364879
false
true
2024-11-28
2024-11-29
0
PrimeIntellect/INTELLECT-1
PrimeIntellect_INTELLECT-1_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/PrimeIntellect/INTELLECT-1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">PrimeIntellect/INTELLECT-1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/PrimeIntellect__INTELLECT-1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
PrimeIntellect/INTELLECT-1
3b8d48b5ce11ee9526495f1db9eb1644518bfce0
4.016002
apache-2.0
55
10
true
false
false
false
0.996384
0.175732
17.57315
0.27398
1.0435
0
0
0.25
0
0.375271
4.142187
0.112035
1.337175
false
true
2024-11-28
2024-12-03
0
PrimeIntellect/INTELLECT-1
PrimeIntellect_INTELLECT-1-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/PrimeIntellect/INTELLECT-1-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">PrimeIntellect/INTELLECT-1-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/PrimeIntellect__INTELLECT-1-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
PrimeIntellect/INTELLECT-1-Instruct
a672cbe91f9bd4df58f90619ca3c2acb2eb11294
1.028268
apache-2.0
109
10
true
false
false
true
1.889979
0
0
0.28698
1.749448
0
0
0.248322
0
0.357688
3.710937
0.106383
0.70922
false
true
2024-11-28
2024-11-29
1
PrimeIntellect/INTELLECT-1-Instruct (Merge)
PygmalionAI_pygmalion-6b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTJForCausalLM
<a target="_blank" href="https://huggingface.co/PygmalionAI/pygmalion-6b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">PygmalionAI/pygmalion-6b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/PygmalionAI__pygmalion-6b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
PygmalionAI/pygmalion-6b
2a0d74449c8fbf0378194e95f64aa92e16297294
5.39236
creativeml-openrail-m
735
6
true
false
false
false
31.923119
0.209104
20.910407
0.319889
5.089577
0.006042
0.60423
0.249161
0
0.368354
3.710937
0.118351
2.039007
false
true
2023-01-07
2024-06-12
0
PygmalionAI/pygmalion-6b
Q-bert_MetaMath-1B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Q-bert/MetaMath-1B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Q-bert/MetaMath-1B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Q-bert__MetaMath-1B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Q-bert/MetaMath-1B
da62756f069aba78d07d4c76108e246cb91dbc35
11.324248
0
1
false
false
false
true
0.465028
0.530039
53.003918
0.345069
8.434611
0
0
0.251678
0.223714
0.328917
0.78125
0.149518
5.501995
false
false
2024-09-30
0
Removed
Qwen_QwQ-32B-Preview_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/QwQ-32B-Preview" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/QwQ-32B-Preview</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__QwQ-32B-Preview-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/QwQ-32B-Preview
1032e81cb936c486aae1d33da75b2fbcd5deed4a
30.444123
apache-2.0
1210
32
true
false
false
true
10.21039
0.403544
40.354371
0.669138
53.387676
0.228852
22.885196
0.281879
4.250559
0.41099
9.807031
0.567819
51.979905
false
true
2024-11-27
2024-11-29
2
Qwen/Qwen2.5-32B
Qwen_Qwen1.5-0.5B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen1.5-0.5B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen1.5-0.5B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen1.5-0.5B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen1.5-0.5B
8f445e3628f3500ee69f24e1303c9f10f5342a39
5.137017
other
145
0
true
false
false
false
0.978737
0.170561
17.056078
0.315354
5.035476
0.004532
0.453172
0.254195
0.559284
0.361625
4.303125
0.130735
3.414967
false
true
2024-01-22
2024-06-13
0
Qwen/Qwen1.5-0.5B
Qwen_Qwen1.5-0.5B-Chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen1.5-0.5B-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen1.5-0.5B-Chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen1.5-0.5B-Chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen1.5-0.5B-Chat
4d14e384a4b037942bb3f3016665157c8bcb70ea
5.564869
other
74
0
true
false
false
true
0.549744
0.180727
18.072714
0.316666
4.318033
0
0
0.269295
2.572707
0.383708
6.063542
0.12126
2.362219
false
true
2024-01-31
2024-06-12
0
Qwen/Qwen1.5-0.5B-Chat
Qwen_Qwen1.5-1.8B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen1.5-1.8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen1.5-1.8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen1.5-1.8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen1.5-1.8B
7846de7ed421727b318d6605a0bfab659da2c067
9.181376
other
43
1
true
false
false
false
0.948871
0.215424
21.542396
0.347612
9.759902
0.026435
2.643505
0.305369
7.38255
0.36051
3.963802
0.188165
9.796099
false
true
2024-01-22
2024-06-13
0
Qwen/Qwen1.5-1.8B
Qwen_Qwen1.5-1.8B-Chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen1.5-1.8B-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen1.5-1.8B-Chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen1.5-1.8B-Chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen1.5-1.8B-Chat
e482ee3f73c375a627a16fdf66fd0c8279743ca6
9.006021
other
48
1
true
false
false
true
0.564329
0.20191
20.190982
0.325591
5.908663
0.004532
0.453172
0.297819
6.375839
0.425969
12.179427
0.180352
8.928044
false
true
2024-01-30
2024-06-12
0
Qwen/Qwen1.5-1.8B-Chat
Qwen_Qwen1.5-110B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen1.5-110B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen1.5-110B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen1.5-110B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen1.5-110B
16659038ecdcc771c1293cf47020fa7cc2750ee8
29.846266
other
93
111
true
false
false
false
71.270888
0.342194
34.219427
0.609996
44.280477
0.247734
24.773414
0.352349
13.646532
0.440844
13.705469
0.53607
48.452275
false
true
2024-04-25
2024-06-13
0
Qwen/Qwen1.5-110B
Qwen_Qwen1.5-110B-Chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen1.5-110B-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen1.5-110B-Chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen1.5-110B-Chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen1.5-110B-Chat
85f86cec25901f2dbd870a86e06756903c9a876a
29.224837
other
123
111
true
false
false
true
72.565293
0.593886
59.388644
0.61838
44.984545
0
0
0.341443
12.192394
0.452167
16.2875
0.482463
42.495937
false
true
2024-04-25
2024-06-12
0
Qwen/Qwen1.5-110B-Chat
Qwen_Qwen1.5-14B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen1.5-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen1.5-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen1.5-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen1.5-14B
dce4b190d34470818e5bec2a92cb8233aaa02ca2
20.514201
other
36
14
true
false
false
false
1.925491
0.290537
29.053689
0.508033
30.063103
0.182024
18.202417
0.294463
5.928412
0.418646
10.464063
0.364362
29.373522
false
true
2024-01-22
2024-06-13
0
Qwen/Qwen1.5-14B
Qwen_Qwen1.5-14B-Chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen1.5-14B-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen1.5-14B-Chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen1.5-14B-Chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen1.5-14B-Chat
9492b22871f43e975435455f5c616c77fe7a50ec
21.023307
other
111
14
true
false
false
true
1.338466
0.476808
47.68082
0.522859
32.756479
0
0
0.270134
2.684564
0.439979
13.930729
0.361785
29.087249
false
true
2024-01-30
2024-06-12
0
Qwen/Qwen1.5-14B-Chat
Qwen_Qwen1.5-32B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen1.5-32B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen1.5-32B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen1.5-32B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen1.5-32B
cefef80dc06a65f89d1d71d0adbc56d335ca2490
27.021817
other
82
32
true
false
false
false
59.967159
0.32973
32.972956
0.571539
38.980352
0.286254
28.625378
0.329698
10.626398
0.427792
12.040625
0.449967
38.885195
false
true
2024-04-01
2024-06-13
0
Qwen/Qwen1.5-32B
Qwen_Qwen1.5-32B-Chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen1.5-32B-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen1.5-32B-Chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen1.5-32B-Chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen1.5-32B-Chat
0997b012af6ddd5465d40465a8415535b2f06cfc
27.193017
other
108
32
true
false
false
true
46.05945
0.55322
55.32199
0.60669
44.554854
0.071752
7.175227
0.306208
7.494407
0.415979
10.197396
0.445728
38.414229
false
true
2024-04-03
2024-06-12
0
Qwen/Qwen1.5-32B-Chat
Qwen_Qwen1.5-4B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen1.5-4B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen1.5-4B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen1.5-4B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen1.5-4B
a66363a0c24e2155c561e4b53c658b1d3965474e
11.327599
other
33
3
true
false
false
false
1.638682
0.244475
24.447466
0.40539
16.249143
0.026435
2.643505
0.276846
3.579418
0.360448
4.822656
0.246011
16.223404
false
true
2024-01-22
2024-06-13
0
Qwen/Qwen1.5-4B
Qwen_Qwen1.5-4B-Chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen1.5-4B-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen1.5-4B-Chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen1.5-4B-Chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen1.5-4B-Chat
a7a4d4945d28bac955554c9abd2f74a71ebbf22f
12.337753
other
38
3
true
false
false
true
0.866151
0.315666
31.566577
0.400555
16.297079
0.010574
1.057402
0.266779
2.237136
0.397781
7.35599
0.239611
15.512337
false
true
2024-01-30
2024-06-12
0
Qwen/Qwen1.5-4B-Chat
Qwen_Qwen1.5-7B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen1.5-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen1.5-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen1.5-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen1.5-7B
831096e3a59a0789a541415da25ef195ceb802fe
15.357504
other
46
7
true
false
false
false
1.827354
0.26843
26.842999
0.45599
23.075769
0.05287
5.287009
0.298658
6.487696
0.410333
9.158333
0.291639
21.293218
false
true
2024-01-22
2024-06-09
0
Qwen/Qwen1.5-7B
Qwen_Qwen1.5-7B-Chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen1.5-7B-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen1.5-7B-Chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen1.5-7B-Chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen1.5-7B-Chat
5f4f5e69ac7f1d508f8369e977de208b4803444b
16.576173
other
164
7
true
false
false
true
1.078827
0.437116
43.711574
0.451005
22.37913
0
0
0.302852
7.04698
0.377906
4.638281
0.29513
21.681073
false
true
2024-01-30
2024-06-12
0
Qwen/Qwen1.5-7B-Chat
Qwen_Qwen1.5-MoE-A2.7B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Qwen2MoeForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen1.5-MoE-A2.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen1.5-MoE-A2.7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen1.5-MoE-A2.7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen1.5-MoE-A2.7B
1a758c50ecb6350748b9ce0a99d2352fd9fc11c9
12.422758
other
195
14
true
true
false
false
9.545613
0.265982
26.598204
0.411352
18.837859
0.001511
0.151057
0.259228
1.230425
0.401344
7.967969
0.277759
19.751034
false
true
2024-02-29
2024-06-13
0
Qwen/Qwen1.5-MoE-A2.7B
Qwen_Qwen1.5-MoE-A2.7B-Chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2MoeForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen1.5-MoE-A2.7B-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen1.5-MoE-A2.7B-Chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen1.5-MoE-A2.7B-Chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen1.5-MoE-A2.7B-Chat
ec052fda178e241c7c443468d2fa1db6618996be
14.823498
other
113
14
true
true
false
true
8.901972
0.379539
37.953851
0.427209
20.041819
0
0
0.274329
3.243848
0.389875
6.334375
0.292304
21.367095
false
true
2024-03-14
2024-06-12
0
Qwen/Qwen1.5-MoE-A2.7B-Chat
Qwen_Qwen2-0.5B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen2-0.5B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen2-0.5B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen2-0.5B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen2-0.5B
ff3a49fac17555b8dfc4db6709f480cc8f16a9fe
7.287062
apache-2.0
117
0
true
false
false
false
1.799938
0.187322
18.732186
0.323912
7.918512
0.030211
3.021148
0.260906
1.454139
0.375208
4.601042
0.171958
7.995346
false
true
2024-05-31
2024-11-30
0
Qwen/Qwen2-0.5B
Qwen_Qwen2-0.5B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen2-0.5B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen2-0.5B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen2-0.5B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen2-0.5B-Instruct
c291d6fce4804a1d39305f388dd32897d1f7acc4
6.410547
apache-2.0
164
0
true
false
false
true
0.557848
0.224666
22.466611
0.317252
5.876044
0.018127
1.812689
0.246644
0
0.335271
2.408854
0.153092
5.899084
false
true
2024-06-03
2024-06-12
1
Qwen/Qwen2-0.5B
Qwen_Qwen2-1.5B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen2-1.5B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen2-1.5B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen2-1.5B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen2-1.5B
8a16abf2848eda07cc5253dec660bf1ce007ad7a
10.445453
apache-2.0
82
1
true
false
false
false
1.108195
0.211327
21.132706
0.357479
11.781834
0.070242
7.024169
0.264262
1.901566
0.365813
3.593229
0.255153
17.239214
false
true
2024-05-31
2024-06-09
0
Qwen/Qwen2-1.5B
Qwen_Qwen2-1.5B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen2-1.5B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen2-1.5B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen2-1.5B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen2-1.5B-Instruct
ba1cf1846d7df0a0591d6c00649f57e798519da8
13.990879
apache-2.0
132
1
true
false
false
true
0.658824
0.337123
33.712328
0.385223
13.695347
0.062689
6.268882
0.261745
1.565996
0.429281
12.026823
0.250083
16.675901
false
true
2024-06-03
2024-06-12
0
Qwen/Qwen2-1.5B-Instruct
Qwen_Qwen2-57B-A14B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Qwen2MoeForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen2-57B-A14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen2-57B-A14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen2-57B-A14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen2-57B-A14B
973e466c39ba76372a2ae464dbca0af3f5a5a2a9
25.033873
apache-2.0
48
57
true
true
false
false
107.031477
0.31127
31.126965
0.56182
38.875989
0.186556
18.655589
0.306208
7.494407
0.417375
10.538542
0.491606
43.511746
false
true
2024-05-22
2024-06-13
0
Qwen/Qwen2-57B-A14B
Qwen_Qwen2-57B-A14B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2MoeForCausalLM
<a target="_blank" href="https://huggingface.co/Qwen/Qwen2-57B-A14B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen2-57B-A14B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Qwen__Qwen2-57B-A14B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Qwen/Qwen2-57B-A14B-Instruct
5ea455a449e61a92a5b194ee06be807647d3e8b5
29.780723
apache-2.0
77
57
true
false
false
true
42.506248
0.633778
63.377837
0.588761
41.785918
0.087613
8.761329
0.331376
10.850112
0.436135
14.183594
0.45753
39.725547
false
true
2024-06-04
2024-08-14
1
Qwen/Qwen2-57B-A14B
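Note on the paired "Raw" and normalized score columns in the rows above: the normalized values appear to rescale raw accuracy above a random-guess floor onto a 0-100 range. This is a minimal sketch inferred from the numbers in this dump (the floors 0.25, 0.10, and 0.0, and the helper name `rescale`, are assumptions, not part of the dataset), checked against the Qwen/QwQ-32B-Preview row listed above.

```python
# Sketch (assumed, not an official formula): raw accuracy rescaled above a
# random-guess floor to a 0-100 score, clamped to 0 below the floor.

def rescale(raw: float, floor: float) -> float:
    """Map raw accuracy in [floor, 1] to a 0-100 score; values below the floor give 0."""
    return max(0.0, (raw - floor) / (1.0 - floor)) * 100.0

# Checked against the Qwen/QwQ-32B-Preview row in this section:
assert abs(rescale(0.281879, 0.25) - 4.250559) < 1e-3   # GPQA Raw -> GPQA (4-way choice floor)
assert abs(rescale(0.567819, 0.10) - 51.979905) < 1e-3  # MMLU-PRO Raw -> MMLU-PRO (10-way choice floor)
assert abs(rescale(0.228852, 0.00) - 22.885196) < 1e-3  # MATH Lvl 5 Raw -> MATH Lvl 5 (no floor)
```

BBH and MUSR do not fit a single floor in the same way (they look like per-subtask rescalings), so this sketch only covers the columns that can be verified directly from the rows above.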