Column                 Type            Range / cardinality
eval_name              stringlengths   12 to 111
Precision              stringclasses   3 values
Type                   stringclasses   6 values
T                      stringclasses   6 values
Weight type            stringclasses   2 values
Architecture           stringclasses   52 values
Model                  stringlengths   355 to 689
fullname               stringlengths   4 to 102
Model sha              stringlengths   0 to 40
Average ⬆️             float64         1.03 to 52
Hub License            stringclasses   26 values
Hub ❤️                 int64           0 to 5.9k
#Params (B)            int64           -1 to 140
Available on the hub   bool            2 classes
MoE                    bool            2 classes
Flagged                bool            2 classes
Chat Template          bool            2 classes
CO₂ cost (kg)          float64         0.03 to 107
IFEval Raw             float64         0 to 0.9
IFEval                 float64         0 to 90
BBH Raw                float64         0.27 to 0.75
BBH                    float64         0.81 to 63.5
MATH Lvl 5 Raw         float64         0 to 0.51
MATH Lvl 5             float64         0 to 50.7
GPQA Raw               float64         0.22 to 0.44
GPQA                   float64         0 to 24.9
MUSR Raw               float64         0.29 to 0.6
MUSR                   float64         0 to 38.5
MMLU-PRO Raw           float64         0.1 to 0.73
MMLU-PRO               float64         0 to 70
Merged                 bool            2 classes
Official Providers     bool            2 classes
Upload To Hub Date     stringclasses   424 values
Submission Date        stringclasses   169 values
Generation             int64           0 to 10
Base Model             stringlengths   4 to 102
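The column list above fixes the field order for every record in this dump. As a minimal sketch (assuming, as the records below suggest, that each record is flattened to exactly one value per line in schema order), a record can be zipped back into a dict:

```python
# Column order recovered from the schema block above (36 columns).
COLUMNS = [
    "eval_name", "Precision", "Type", "T", "Weight type", "Architecture",
    "Model", "fullname", "Model sha", "Average ⬆️", "Hub License", "Hub ❤️",
    "#Params (B)", "Available on the hub", "MoE", "Flagged", "Chat Template",
    "CO₂ cost (kg)", "IFEval Raw", "IFEval", "BBH Raw", "BBH",
    "MATH Lvl 5 Raw", "MATH Lvl 5", "GPQA Raw", "GPQA", "MUSR Raw", "MUSR",
    "MMLU-PRO Raw", "MMLU-PRO", "Merged", "Official Providers",
    "Upload To Hub Date", "Submission Date", "Generation", "Base Model",
]

def parse_record(lines):
    """Zip one flattened 36-line record into a dict keyed by column name.

    Note: the truncated "Removed" records later in this dump have fewer
    than 36 lines and would need special handling.
    """
    if len(lines) != len(COLUMNS):
        raise ValueError(f"expected {len(COLUMNS)} values, got {len(lines)}")
    return dict(zip(COLUMNS, lines))
```

All values arrive as strings; casting to int64/float64/bool per the schema is left out of the sketch.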
deepseek-ai_deepseek-moe-16b-chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
DeepseekForCausalLM
<a target="_blank" href="https://huggingface.co/deepseek-ai/deepseek-moe-16b-chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">deepseek-ai/deepseek-moe-16b-chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/deepseek-ai__deepseek-moe-16b-chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
deepseek-ai/deepseek-moe-16b-chat
eefd8ac7e8dc90e095129fe1a537d5e236b2e57c
10.177322
other
115
16
true
true
false
true
4.593478
0.366299
36.62992
0.327495
6.573749
0.018882
1.888218
0.224832
0
0.38076
5.261719
0.196393
10.710328
false
true
2024-01-09
2024-06-12
0
deepseek-ai/deepseek-moe-16b-chat
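The Average ⬆️ field appears to be the arithmetic mean of the six normalized benchmark scores (IFEval, BBH, MATH Lvl 5, GPQA, MUSR, MMLU-PRO). Checking against the deepseek-ai/deepseek-moe-16b-chat record above:

```python
# Normalized benchmark scores from the deepseek-moe-16b-chat record above.
scores = {
    "IFEval": 36.62992,
    "BBH": 6.573749,
    "MATH Lvl 5": 1.888218,
    "GPQA": 0.0,
    "MUSR": 5.261719,
    "MMLU-PRO": 10.710328,
}

average = sum(scores.values()) / len(scores)
# Matches the record's "Average ⬆️" value of 10.177322.
assert abs(average - 10.177322) < 1e-6
```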
dfurman_CalmeRys-78B-Orpo-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/dfurman/CalmeRys-78B-Orpo-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dfurman/CalmeRys-78B-Orpo-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dfurman__CalmeRys-78B-Orpo-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dfurman/CalmeRys-78B-Orpo-v0.1
7988deb48419c3f56bb24c139c23e5c476ec03f8
51.243911
mit
57
77
true
false
false
true
12.996767
0.816327
81.632734
0.726228
61.924764
0.4071
40.70997
0.400168
20.022371
0.590177
36.372135
0.701213
66.801492
false
false
2024-09-24
2024-09-24
1
dfurman/CalmeRys-78B-Orpo-v0.1 (Merge)
dfurman_Llama-3-70B-Orpo-v0.1_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
<a target="_blank" href="https://huggingface.co/dfurman/Llama-3-70B-Orpo-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dfurman/Llama-3-70B-Orpo-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dfurman__Llama-3-70B-Orpo-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dfurman/Llama-3-70B-Orpo-v0.1
6bf3be5f7f427164c879f7a4ec9ccb6b22aa6631
18.17418
llama3
2
70
true
false
false
true
14.440343
0.204907
20.490742
0.465524
24.093817
0.150302
15.030211
0.25755
1.006711
0.453438
16.279688
0.389295
32.143913
false
false
2024-04-26
2024-08-30
1
dfurman/Llama-3-70B-Orpo-v0.1 (Merge)
dfurman_Llama-3-8B-Orpo-v0.1_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
<a target="_blank" href="https://huggingface.co/dfurman/Llama-3-8B-Orpo-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dfurman/Llama-3-8B-Orpo-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dfurman__Llama-3-8B-Orpo-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dfurman/Llama-3-8B-Orpo-v0.1
f02aef830e12a50892ac065826d5eb3dfc7675d1
10.756011
llama3
1
8
true
false
false
true
0.928079
0.283518
28.351773
0.384242
13.680746
0.043807
4.380665
0.260906
1.454139
0.356635
2.246094
0.229804
14.422651
false
false
2024-04-26
2024-08-30
1
dfurman/Llama-3-8B-Orpo-v0.1 (Merge)
dfurman_Llama-3-8B-Orpo-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dfurman/Llama-3-8B-Orpo-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dfurman/Llama-3-8B-Orpo-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dfurman__Llama-3-8B-Orpo-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dfurman/Llama-3-8B-Orpo-v0.1
f02aef830e12a50892ac065826d5eb3dfc7675d1
11.076158
llama3
1
8
true
false
false
true
0.949861
0.300004
30.000399
0.385297
13.773376
0.041541
4.154079
0.261745
1.565996
0.357875
2.734375
0.228059
14.228723
false
false
2024-04-26
2024-08-30
1
dfurman/Llama-3-8B-Orpo-v0.1 (Merge)
dfurman_Qwen2-72B-Orpo-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/dfurman/Qwen2-72B-Orpo-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dfurman/Qwen2-72B-Orpo-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dfurman__Qwen2-72B-Orpo-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dfurman/Qwen2-72B-Orpo-v0.1
26c7bbaa728822c60bb47b2808972140653aae4c
43.76948
other
4
72
true
false
false
true
12.625332
0.787976
78.79759
0.696902
57.414364
0.38142
38.141994
0.384228
17.897092
0.478427
20.870052
0.545462
49.495789
false
false
2024-07-05
2024-08-22
1
dfurman/Qwen2-72B-Orpo-v0.1 (Merge)
dicta-il_dictalm2.0_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/dicta-il/dictalm2.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dicta-il/dictalm2.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dicta-il__dictalm2.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dicta-il/dictalm2.0
f8ab3208e95a7b44a9a2fbb9bbbdd8ea11be509d
11.882597
apache-2.0
11
7
true
false
false
false
0.674038
0.241327
24.132746
0.401787
16.489846
0.017372
1.73716
0.291946
5.592841
0.381969
5.51276
0.260472
17.83023
false
false
2024-04-10
2024-07-31
0
dicta-il/dictalm2.0
dicta-il_dictalm2.0-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/dicta-il/dictalm2.0-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dicta-il/dictalm2.0-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dicta-il__dictalm2.0-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dicta-il/dictalm2.0-instruct
257c6023d6ac1bfa12110b7b17e7600da7da4e1e
16.577812
apache-2.0
18
7
true
false
false
true
0.648395
0.441213
44.121265
0.425608
19.688076
0.010574
1.057402
0.302852
7.04698
0.394583
9.722917
0.260472
17.83023
false
false
2024-04-14
2024-07-31
1
dicta-il/dictalm2.0
distilbert_distilgpt2_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GPT2LMHeadModel
<a target="_blank" href="https://huggingface.co/distilbert/distilgpt2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">distilbert/distilgpt2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/distilbert__distilgpt2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
distilbert/distilgpt2
2290a62682d06624634c1f46a6ad5be0f47f38aa
3.901569
apache-2.0
452
0
true
false
false
false
0.123082
0.0611
6.11001
0.303799
2.83522
0
0
0.259228
1.230425
0.420729
11.157813
0.118684
2.075946
false
true
2022-03-02
2024-06-12
0
distilbert/distilgpt2
divyanshukunwar_SASTRI_1_9B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/divyanshukunwar/SASTRI_1_9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">divyanshukunwar/SASTRI_1_9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/divyanshukunwar__SASTRI_1_9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
divyanshukunwar/SASTRI_1_9B
3afeb5b296b1d6489401105e2ea6fc5c00d09c07
19.383995
apache-2.0
0
5
true
false
false
true
3.896215
0.420729
42.072922
0.46805
23.534216
0.113293
11.329305
0.321309
9.50783
0.383115
5.55599
0.318733
24.303709
false
false
2024-11-20
2024-11-23
1
divyanshukunwar/SASTRI_1_9B (Merge)
djuna_G2-BigGSHT-27B-2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/djuna/G2-BigGSHT-27B-2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">djuna/G2-BigGSHT-27B-2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/djuna__G2-BigGSHT-27B-2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
djuna/G2-BigGSHT-27B-2
b52e0c08d19232acebf85b68ee5989cc23c0d519
32.132149
0
27
false
false
false
true
5.025429
0.797443
79.744301
0.641474
48.814372
0
0
0.363255
15.100671
0.407208
9.934375
0.452793
39.199173
false
false
2024-10-29
2024-11-06
1
djuna/G2-BigGSHT-27B-2 (Merge)
djuna_G2-GSHT_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/djuna/G2-GSHT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">djuna/G2-GSHT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/djuna__G2-GSHT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
djuna/G2-GSHT
afa34f893a74af2a21b71f83d7bcc16aa818d157
22.001317
0
10
false
false
false
true
2.151692
0.563012
56.30117
0.526973
30.992059
0.034743
3.47432
0.325503
10.067114
0.400573
8.171615
0.307015
23.001625
false
false
2024-09-09
2024-10-05
1
djuna/G2-GSHT (Merge)
djuna_Gemma-2-gemmama-9b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/djuna/Gemma-2-gemmama-9b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">djuna/Gemma-2-gemmama-9b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/djuna__Gemma-2-gemmama-9b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
djuna/Gemma-2-gemmama-9b
1d6c53ad18970ac082e86bfa0159789b6a6e79c0
25.542507
3
10
false
false
false
true
2.764097
0.77034
77.034047
0.542004
32.916051
0
0
0.33557
11.409396
0.403146
8.459896
0.310921
23.435653
false
false
2024-08-31
2024-10-05
1
djuna/Gemma-2-gemmama-9b (Merge)
djuna_L3.1-ForStHS_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/djuna/L3.1-ForStHS" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">djuna/L3.1-ForStHS</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/djuna__L3.1-ForStHS-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
djuna/L3.1-ForStHS
f5442e1f27e4a0c469504624ea85afdc6907c9cc
28.272628
3
8
false
false
false
true
0.843664
0.781331
78.133131
0.52027
31.391217
0.14577
14.577039
0.291107
5.480984
0.402646
9.664063
0.373504
30.389332
false
false
2024-09-10
2024-09-15
1
djuna/L3.1-ForStHS (Merge)
djuna_L3.1-Promissum_Mane-8B-Della-1.5-calc_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/djuna/L3.1-Promissum_Mane-8B-Della-1.5-calc" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">djuna/L3.1-Promissum_Mane-8B-Della-1.5-calc</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/djuna__L3.1-Promissum_Mane-8B-Della-1.5-calc-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
djuna/L3.1-Promissum_Mane-8B-Della-1.5-calc
67dc71cb877c1ebaeb634e116fc938b223338cf6
29.184843
2
8
false
false
false
true
0.744153
0.723529
72.352912
0.543292
34.879576
0.139728
13.97281
0.314597
8.612975
0.425281
13.026823
0.390376
32.263963
false
false
2024-10-29
2024-10-29
1
djuna/L3.1-Promissum_Mane-8B-Della-1.5-calc (Merge)
djuna_L3.1-Promissum_Mane-8B-Della-calc_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/djuna/L3.1-Promissum_Mane-8B-Della-calc" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">djuna/L3.1-Promissum_Mane-8B-Della-calc</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/djuna__L3.1-Promissum_Mane-8B-Della-calc-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
djuna/L3.1-Promissum_Mane-8B-Della-calc
42c6cd88b8394876cdbcf64e56633ad0a371b5f4
23.4173
1
8
false
false
false
true
0.82272
0.544153
54.415285
0.548588
35.553826
0
0
0.299497
6.599553
0.42299
12.807031
0.380153
31.128103
false
false
2024-10-07
2024-10-20
1
djuna/L3.1-Promissum_Mane-8B-Della-calc (Merge)
djuna_L3.1-Purosani-2-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/djuna/L3.1-Purosani-2-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">djuna/L3.1-Purosani-2-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/djuna__L3.1-Purosani-2-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
djuna/L3.1-Purosani-2-8B
e5acd6277a1286c5e18fcb3e89a836ffc8a75b8f
23.050433
3
8
false
false
false
true
0.864952
0.498815
49.881537
0.518212
31.391343
0.113293
11.329305
0.301174
6.823266
0.381625
8.303125
0.375166
30.574025
false
false
2024-10-04
2024-10-20
1
djuna/L3.1-Purosani-2-8B (Merge)
djuna_L3.1-Suze-Vume-calc_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/djuna/L3.1-Suze-Vume-calc" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">djuna/L3.1-Suze-Vume-calc</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/djuna__L3.1-Suze-Vume-calc-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
djuna/L3.1-Suze-Vume-calc
830c07d136ecd8171805078606f00c4ee69f21c3
25.975608
1
8
false
false
false
true
0.804519
0.729674
72.967393
0.516421
31.136638
0.112538
11.253776
0.281879
4.250559
0.384292
8.303125
0.351479
27.942154
false
false
2024-08-26
2024-09-04
1
djuna/L3.1-Suze-Vume-calc (Merge)
djuna_MN-Chinofun_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/djuna/MN-Chinofun" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">djuna/MN-Chinofun</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/djuna__MN-Chinofun-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
djuna/MN-Chinofun
71b47c86f32e107b407fada44ec6b893c5eb8bb0
24.369131
3
12
false
false
false
true
1.446493
0.611022
61.102209
0.49527
28.483575
0.111782
11.178248
0.296141
6.152125
0.408354
10.377604
0.360289
28.921025
false
false
2024-09-16
2024-09-23
1
djuna/MN-Chinofun (Merge)
djuna_MN-Chinofun-12B-2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/djuna/MN-Chinofun-12B-2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">djuna/MN-Chinofun-12B-2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/djuna__MN-Chinofun-12B-2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
djuna/MN-Chinofun-12B-2
d2aab6837c2ad2dfebb18b15549affd9dd2b8723
25.367885
3
12
false
false
false
true
0.977697
0.617067
61.706716
0.503696
29.526084
0.111782
11.178248
0.305369
7.38255
0.426833
13.354167
0.361536
29.059545
false
false
2024-10-23
2024-11-26
1
djuna/MN-Chinofun-12B-2 (Merge)
djuna_MN-Chinofun-12B-3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/djuna/MN-Chinofun-12B-3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">djuna/MN-Chinofun-12B-3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/djuna__MN-Chinofun-12B-3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
djuna/MN-Chinofun-12B-3
fa64c9bc66221946d7425c4eea93828900083d84
18.162867
1
12
false
false
false
true
1.207739
0.305274
30.527445
0.534786
34.219196
0.086858
8.685801
0.26594
2.12528
0.419792
10.907292
0.30261
22.51219
false
false
2024-12-05
2024-12-05
1
djuna/MN-Chinofun-12B-3 (Merge)
djuna_Q2.5-Partron-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/djuna/Q2.5-Partron-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">djuna/Q2.5-Partron-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/djuna__Q2.5-Partron-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
djuna/Q2.5-Partron-7B
3a6d3cca23c0e1c6bcba38887fc819729d5d16cf
27.077248
0
7
false
false
false
true
1.334763
0.732122
73.212188
0.541847
35.257265
0.000755
0.075529
0.297819
6.375839
0.416542
11.067708
0.428275
36.474956
false
false
2024-11-08
2024-11-08
1
djuna/Q2.5-Partron-7B (Merge)
djuna-test-lab_TEST-L3.2-ReWish-3B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/djuna-test-lab/TEST-L3.2-ReWish-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">djuna-test-lab/TEST-L3.2-ReWish-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/djuna-test-lab__TEST-L3.2-ReWish-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
djuna-test-lab/TEST-L3.2-ReWish-3B
0cb7d434c4647faed475f17d74e9047007cd3782
22.445512
1
3
false
false
false
true
0.640631
0.636776
63.677598
0.449541
22.0667
0.129154
12.915408
0.283557
4.474273
0.37775
7.91875
0.312583
23.620346
false
false
2024-10-23
2024-10-24
1
djuna-test-lab/TEST-L3.2-ReWish-3B (Merge)
djuna-test-lab_TEST-L3.2-ReWish-3B-ties-w-base_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/djuna-test-lab/TEST-L3.2-ReWish-3B-ties-w-base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">djuna-test-lab/TEST-L3.2-ReWish-3B-ties-w-base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/djuna-test-lab__TEST-L3.2-ReWish-3B-ties-w-base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
djuna-test-lab/TEST-L3.2-ReWish-3B-ties-w-base
ebab6c0266ae7846b2bb9a595a2651a23b031372
22.420117
0
3
false
false
false
true
1.281374
0.635252
63.525224
0.449541
22.0667
0.129154
12.915408
0.283557
4.474273
0.37775
7.91875
0.312583
23.620346
false
false
2024-10-23
2024-10-23
1
djuna-test-lab/TEST-L3.2-ReWish-3B-ties-w-base (Merge)
dnhkng_RYS-Medium_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/dnhkng/RYS-Medium" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dnhkng/RYS-Medium</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dnhkng__RYS-Medium-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dnhkng/RYS-Medium
de09a79e6b2efdcc97490a37b770764e62749fd0
25.944227
mit
3
18
true
false
false
false
2.136378
0.440613
44.061313
0.628473
47.734201
0.077795
7.779456
0.32802
10.402685
0.406927
8.732552
0.432596
36.955157
false
false
2024-07-17
2024-07-17
0
dnhkng/RYS-Medium
dnhkng_RYS-Llama-3-8B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dnhkng/RYS-Llama-3-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dnhkng/RYS-Llama-3-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dnhkng__RYS-Llama-3-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dnhkng/RYS-Llama-3-8B-Instruct
293ab00d1e2be2752f97d5568fde2b09f6a1caae
21.910187
mit
1
8
true
false
false
true
0.805187
0.695777
69.57772
0.480871
25.373015
0.067976
6.797583
0.25755
1.006711
0.338344
0.292969
0.355718
28.413121
false
false
2024-08-06
2024-08-07
0
dnhkng/RYS-Llama-3-8B-Instruct
dnhkng_RYS-Llama-3-Huge-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dnhkng/RYS-Llama-3-Huge-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dnhkng/RYS-Llama-3-Huge-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dnhkng__RYS-Llama-3-Huge-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dnhkng/RYS-Llama-3-Huge-Instruct
cfe14a5339e88a7a89f075d9d48215d45f64acaf
34.68177
mit
1
99
true
false
false
true
14.736988
0.768592
76.859178
0.648087
49.073721
0.231118
23.111782
0.260906
1.454139
0.42076
11.928385
0.510971
45.663416
false
false
2024-08-06
2024-08-07
0
dnhkng/RYS-Llama-3-Huge-Instruct
dnhkng_RYS-Llama-3-Large-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dnhkng/RYS-Llama-3-Large-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dnhkng/RYS-Llama-3-Large-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dnhkng__RYS-Llama-3-Large-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dnhkng/RYS-Llama-3-Large-Instruct
01e3208aaf7bf6d2b09737960c701ec6628977fe
36.094509
mit
1
73
true
false
false
true
9.811517
0.805062
80.506168
0.652527
49.665539
0.23716
23.716012
0.28943
5.257271
0.418031
11.453906
0.513713
45.968159
false
false
2024-08-06
2024-08-07
0
dnhkng/RYS-Llama-3-Large-Instruct
dnhkng_RYS-Llama-3.1-8B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/dnhkng/RYS-Llama-3.1-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dnhkng/RYS-Llama-3.1-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dnhkng__RYS-Llama-3.1-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dnhkng/RYS-Llama-3.1-8B-Instruct
d4e2393403dcae19860da7c29519c8fe6fbf2fad
26.650662
mit
10
8
true
false
false
true
0.971672
0.768492
76.849205
0.516365
31.085445
0.126133
12.613293
0.267617
2.348993
0.368104
7.679688
0.363946
29.327349
false
false
2024-08-08
2024-08-30
0
dnhkng/RYS-Llama-3.1-8B-Instruct
dnhkng_RYS-Llama3.1-Large_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dnhkng/RYS-Llama3.1-Large" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dnhkng/RYS-Llama3.1-Large</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dnhkng__RYS-Llama3.1-Large-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dnhkng/RYS-Llama3.1-Large
52cc979de78155b33689efa48f52a8aab184bd86
41.937416
mit
1
81
true
false
false
true
15.406329
0.8492
84.920012
0.689911
55.414864
0.304381
30.438066
0.374161
16.55481
0.455396
17.091146
0.52485
47.2056
false
false
2024-08-11
2024-08-22
0
dnhkng/RYS-Llama3.1-Large
dnhkng_RYS-Phi-3-medium-4k-instruct_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/dnhkng/RYS-Phi-3-medium-4k-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dnhkng/RYS-Phi-3-medium-4k-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dnhkng__RYS-Phi-3-medium-4k-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dnhkng/RYS-Phi-3-medium-4k-instruct
1009e916b1ff8c9a53bc9d8ff48bea2a15ccde26
28.464284
mit
1
17
true
false
false
false
2.310547
0.439139
43.913926
0.622631
46.748971
0.123112
12.311178
0.354866
13.982103
0.425281
11.09349
0.484624
42.736037
false
false
2024-08-06
2024-08-07
0
dnhkng/RYS-Phi-3-medium-4k-instruct
dnhkng_RYS-XLarge_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/dnhkng/RYS-XLarge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dnhkng/RYS-XLarge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dnhkng__RYS-XLarge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dnhkng/RYS-XLarge
0f84dd9dde60f383e1e2821496befb4ce9a11ef6
45.131222
mit
78
77
true
false
false
false
13.576083
0.799566
79.956626
0.705003
58.773567
0.412387
41.238671
0.384228
17.897092
0.496969
23.721094
0.542803
49.200281
false
false
2024-07-24
2024-08-07
0
dnhkng/RYS-XLarge
dnhkng_RYS-XLarge-base_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/dnhkng/RYS-XLarge-base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dnhkng/RYS-XLarge-base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dnhkng__RYS-XLarge-base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dnhkng/RYS-XLarge-base
c718b3d9e24916e3b0347d3fdaa5e5a097c2f603
43.970955
mit
6
77
true
false
false
true
13.587524
0.791023
79.102337
0.704729
58.692146
0.371601
37.160121
0.379195
17.225951
0.490271
22.417188
0.543052
49.227985
false
false
2024-08-02
2024-08-30
0
dnhkng/RYS-XLarge-base
dnhkng_RYS-XLarge2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/dnhkng/RYS-XLarge2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dnhkng/RYS-XLarge2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dnhkng__RYS-XLarge2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dnhkng/RYS-XLarge2
3ce16c9427e93e09ce10a28fa644469d49a51113
35.001876
0
77
false
false
false
true
13.375885
0.490197
49.019712
0.657395
51.549936
0.271903
27.190332
0.374161
16.55481
0.450802
17.05026
0.537816
48.646203
false
false
2024-10-11
0
Removed
dreamgen_WizardLM-2-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/dreamgen/WizardLM-2-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dreamgen/WizardLM-2-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dreamgen__WizardLM-2-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dreamgen/WizardLM-2-7B
b5f2d7bff91445a47331dcce588aee009d11d255
14.82719
apache-2.0
36
7
true
false
false
true
0.566725
0.458298
45.829843
0.348679
9.213114
0.030211
3.021148
0.286913
4.9217
0.394094
7.528385
0.266041
18.448951
false
false
2024-04-16
2024-06-27
0
dreamgen/WizardLM-2-7B
dustinwloring1988_Reflexis-8b-chat-v1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dustinwloring1988/Reflexis-8b-chat-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dustinwloring1988/Reflexis-8b-chat-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dustinwloring1988__Reflexis-8b-chat-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dustinwloring1988/Reflexis-8b-chat-v1
e96bd9694ae87a4f612825310eb7afaea5b0aa28
17.340651
0
8
false
false
false
true
0.891142
0.365775
36.577503
0.46636
24.109958
0.114804
11.480363
0.254195
0.559284
0.375396
4.824479
0.338431
26.492317
false
false
2024-09-14
0
Removed
dustinwloring1988_Reflexis-8b-chat-v2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dustinwloring1988/Reflexis-8b-chat-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dustinwloring1988/Reflexis-8b-chat-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dustinwloring1988__Reflexis-8b-chat-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dustinwloring1988/Reflexis-8b-chat-v2
817408ebfaa7ba0ea9433e1de4bfa120d38d2a0f
18.364751
0
8
false
false
false
true
0.94037
0.391204
39.120423
0.47238
24.892196
0.121601
12.160121
0.270134
2.684564
0.352635
4.91276
0.337766
26.41844
false
false
2024-09-14
0
Removed
dustinwloring1988_Reflexis-8b-chat-v3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dustinwloring1988/Reflexis-8b-chat-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dustinwloring1988/Reflexis-8b-chat-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dustinwloring1988__Reflexis-8b-chat-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dustinwloring1988/Reflexis-8b-chat-v3
dcfa1a6a9f94a099286891d732b17cbbe97a644e
20.500265
0
8
false
false
false
true
0.891467
0.536734
53.673364
0.465831
24.168293
0.120846
12.084592
0.24245
0
0.351177
4.763802
0.354804
28.31154
false
false
2024-09-14
0
Removed
dustinwloring1988_Reflexis-8b-chat-v4_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dustinwloring1988/Reflexis-8b-chat-v4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dustinwloring1988/Reflexis-8b-chat-v4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dustinwloring1988__Reflexis-8b-chat-v4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dustinwloring1988/Reflexis-8b-chat-v4
81e20c2e40f2028818d5d6d27ec9e0d503ae8cc1
18.530939
0
8
false
false
false
true
0.88527
0.469789
46.978905
0.468601
24.33177
0.102719
10.271903
0.23406
0
0.339302
3.046094
0.339013
26.556959
false
false
2024-09-14
0
Removed
dustinwloring1988_Reflexis-8b-chat-v5_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dustinwloring1988/Reflexis-8b-chat-v5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dustinwloring1988/Reflexis-8b-chat-v5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dustinwloring1988__Reflexis-8b-chat-v5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dustinwloring1988/Reflexis-8b-chat-v5
12970eec99f458a3982eb502b71b6df0bc74bb52
18.586622
0
8
false
false
false
true
0.913096
0.423752
42.375231
0.478169
25.195784
0.124622
12.462236
0.270973
2.796421
0.335365
4.053906
0.321725
24.636155
false
false
2024-09-14
0
Removed
dustinwloring1988_Reflexis-8b-chat-v6_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dustinwloring1988/Reflexis-8b-chat-v6" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dustinwloring1988/Reflexis-8b-chat-v6</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dustinwloring1988__Reflexis-8b-chat-v6-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dustinwloring1988/Reflexis-8b-chat-v6
a0b30a21a8eea9a32a2767755dc2dbd44eeb383f
20.445597
0
8
false
false
false
true
0.899203
0.493894
49.389398
0.480954
26.116103
0.135952
13.595166
0.262584
1.677852
0.375333
4.35
0.347906
27.545065
false
false
2024-09-14
0
Removed
dustinwloring1988_Reflexis-8b-chat-v7_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dustinwloring1988/Reflexis-8b-chat-v7" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dustinwloring1988/Reflexis-8b-chat-v7</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dustinwloring1988__Reflexis-8b-chat-v7-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dustinwloring1988/Reflexis-8b-chat-v7
e8d990012ccd855e65d51cb7cfd1762632a8f217
18.843739
0
8
false
false
false
true
0.902111
0.398048
39.804829
0.480983
25.987497
0.148036
14.803625
0.261745
1.565996
0.322156
1.536198
0.364279
29.364288
false
false
2024-09-14
0
Removed
dwikitheduck_gemma-2-2b-id_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/dwikitheduck/gemma-2-2b-id" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dwikitheduck/gemma-2-2b-id</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dwikitheduck__gemma-2-2b-id-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dwikitheduck/gemma-2-2b-id
6f191d4a7618664619adda1cd96d9d1bf72f33b2
14.182478
gemma
0
2
true
false
false
true
4.599594
0.387856
38.785644
0.396217
15.415129
0.005287
0.528701
0.299497
6.599553
0.415427
10.728385
0.217337
13.037456
false
false
2024-10-24
2024-11-14
0
dwikitheduck/gemma-2-2b-id
dwikitheduck_gemma-2-2b-id-inst_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/dwikitheduck/gemma-2-2b-id-inst" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dwikitheduck/gemma-2-2b-id-inst</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dwikitheduck__gemma-2-2b-id-inst-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dwikitheduck/gemma-2-2b-id-inst
1c046ade199128da926004e154698546d65e3084
14.182478
gemma
0
2
true
false
false
true
1.410396
0.387856
38.785644
0.396217
15.415129
0.005287
0.528701
0.299497
6.599553
0.415427
10.728385
0.217337
13.037456
false
false
2024-10-24
2024-11-24
0
dwikitheduck/gemma-2-2b-id-inst
dwikitheduck_gemma-2-2b-id-instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/dwikitheduck/gemma-2-2b-id-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dwikitheduck/gemma-2-2b-id-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dwikitheduck__gemma-2-2b-id-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dwikitheduck/gemma-2-2b-id-instruct
1c046ade199128da926004e154698546d65e3084
14.182478
gemma
0
2
true
false
false
true
1.416907
0.387856
38.785644
0.396217
15.415129
0.005287
0.528701
0.299497
6.599553
0.415427
10.728385
0.217337
13.037456
false
false
2024-10-24
2024-11-15
0
dwikitheduck/gemma-2-2b-id-instruct
dwikitheduck_gen-inst-1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/dwikitheduck/gen-inst-1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dwikitheduck/gen-inst-1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dwikitheduck__gen-inst-1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dwikitheduck/gen-inst-1
73180b0a57469bbd12f7d037a1cc25e53c252ad6
34.032262
apache-2.0
0
14
true
false
false
true
1.529872
0.775011
77.501141
0.641993
48.316742
0.044562
4.456193
0.371644
16.219239
0.420542
12.267708
0.508893
45.43255
false
false
2024-11-18
2024-11-24
2
Qwen/Qwen2.5-14B
dwikitheduck_gen-try1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/dwikitheduck/gen-try1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dwikitheduck/gen-try1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dwikitheduck__gen-try1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dwikitheduck/gen-try1
9c2cab728518e179e5d8891f3f9775515f15cea2
34.830053
apache-2.0
0
14
true
false
false
true
1.583081
0.752205
75.220526
0.635851
47.413129
0.135196
13.519637
0.341443
12.192394
0.441563
14.961979
0.511054
45.672651
false
false
2024-11-11
2024-11-12
1
dwikitheduck/gen-try1 (Merge)
dwikitheduck_gen-try1-notemp_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/dwikitheduck/gen-try1-notemp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dwikitheduck/gen-try1-notemp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dwikitheduck__gen-try1-notemp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dwikitheduck/gen-try1-notemp
391925b02f6cd60e7c4ef1321fe89a92d6b9fdf0
29.669185
0
14
false
false
false
false
1.895606
0.26271
26.270961
0.626267
45.749093
0.274169
27.416918
0.354027
13.870246
0.471417
17.927083
0.521027
46.780807
false
false
2024-11-13
0
Removed
dzakwan_dzakwan-MoE-4x7b-Beta_float16
float16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/dzakwan/dzakwan-MoE-4x7b-Beta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dzakwan/dzakwan-MoE-4x7b-Beta</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dzakwan__dzakwan-MoE-4x7b-Beta-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dzakwan/dzakwan-MoE-4x7b-Beta
e89f82f2afa1961335de5a6d6d05bd850d1d61d9
20.756715
apache-2.0
0
24
true
true
false
false
1.456028
0.44426
44.426012
0.514044
32.074208
0.077039
7.703927
0.286074
4.809843
0.42674
12.109115
0.310755
23.417184
true
false
2024-05-26
2024-08-05
1
dzakwan/dzakwan-MoE-4x7b-Beta (Merge)
ehristoforu_Gemma2-9B-it-psy10k-mental_health_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/Gemma2-9B-it-psy10k-mental_health" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/Gemma2-9B-it-psy10k-mental_health</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__Gemma2-9B-it-psy10k-mental_health-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/Gemma2-9B-it-psy10k-mental_health
4adc2d61d530d23026493d29e6191e06cf549fc6
26.764494
apache-2.0
1
9
true
false
false
true
2.27683
0.588666
58.866585
0.553938
35.566009
0.137462
13.746224
0.337248
11.63311
0.408604
9.342188
0.382896
31.432846
false
false
2024-07-16
2024-07-31
4
google/gemma-2-9b
ehristoforu_Gemma2-9b-it-train6_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/Gemma2-9b-it-train6" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/Gemma2-9b-it-train6</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__Gemma2-9b-it-train6-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/Gemma2-9b-it-train6
e72bf00b427c22c48b468818cf75300a373a0c8a
28.897532
apache-2.0
2
9
true
false
false
true
1.993683
0.702522
70.252153
0.589809
40.987625
0.0929
9.29003
0.328859
10.514541
0.408417
9.652083
0.394199
32.688756
false
false
2024-07-22
2024-07-31
8
google/gemma-2-9b
ehristoforu_HappyLlama1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/HappyLlama1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/HappyLlama1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__HappyLlama1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/HappyLlama1
9bee1c404de70fc0ebe3cbcd2af2303a313a24be
26.043033
apache-2.0
0
8
true
false
false
true
0.714361
0.736269
73.626866
0.499573
28.499773
0.101208
10.120846
0.283557
4.474273
0.428687
11.252604
0.354555
28.283836
false
false
2024-11-29
2024-11-30
1
voidful/Llama-3.2-8B-Instruct
ehristoforu_RQwen-v0.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/RQwen-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/RQwen-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__RQwen-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/RQwen-v0.1
96d013d2db2ae47be9da1d1cd5b83782bd8f4096
32.480002
apache-2.0
2
14
true
false
false
true
1.708914
0.762497
76.249684
0.644644
48.490852
0.029456
2.945619
0.325503
10.067114
0.413906
10.438281
0.520196
46.68846
false
false
2024-11-24
2024-11-24
1
ehristoforu/RQwen-v0.1 (Merge)
ehristoforu_RQwen-v0.2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/RQwen-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/RQwen-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__RQwen-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/RQwen-v0.2
102ff435814388f4da9e7ebc25c5fbae7120638a
35.436608
apache-2.0
1
14
true
false
false
true
1.294243
0.750357
75.035683
0.642689
48.683837
0.191088
19.108761
0.337248
11.63311
0.420667
11.95
0.515874
46.208259
false
false
2024-11-24
2024-11-25
2
ehristoforu/RQwen-v0.1 (Merge)
ehristoforu_SoRu-0009_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/SoRu-0009" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/SoRu-0009</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__SoRu-0009-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/SoRu-0009
fe4f439882175c3cad8a0f08f7b14d18318b53d1
5.947773
apache-2.0
0
0
true
false
false
true
0.511931
0.258188
25.818827
0.314998
5.137458
0
0
0.260906
1.454139
0.336948
0.61849
0.12392
2.657728
false
false
2024-11-26
2024-11-27
10
Vikhrmodels/Vikhr-Qwen-2.5-0.5b-Instruct (Merge)
ehristoforu_mllama-3.1-8b-instruct_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/mllama-3.1-8b-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/mllama-3.1-8b-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__mllama-3.1-8b-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/mllama-3.1-8b-instruct
f7be209bee659916c03b6a3b77e67237cfed2c12
18.615902
0
8
false
false
false
true
0.744619
0.345791
34.579139
0.471766
26.370934
0.273414
27.34139
0.270134
2.684564
0.338
3.683333
0.253324
17.036052
false
false
2024-12-04
2024-12-04
1
ehristoforu/mllama-3.1-8b-instruct (Merge)
ehristoforu_mllama-3.1-8b-it_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/mllama-3.1-8b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/mllama-3.1-8b-it</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__mllama-3.1-8b-it-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/mllama-3.1-8b-it
5dc167a466759e5d60c073dca4e938463e2fd813
21.296434
0
8
false
false
false
false
0.733803
0.387882
38.788193
0.486803
28.024834
0.327039
32.703927
0.276846
3.579418
0.334865
6.658073
0.262217
18.024158
false
false
2024-12-04
2024-12-04
1
ehristoforu/mllama-3.1-8b-it (Merge)
elinas_Chronos-Gold-12B-1.0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/elinas/Chronos-Gold-12B-1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">elinas/Chronos-Gold-12B-1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/elinas__Chronos-Gold-12B-1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
elinas/Chronos-Gold-12B-1.0
cf76a4621b9dfc0c2e6d930756e6c7c9ce2b260b
21.488289
apache-2.0
37
12
true
false
false
true
1.502531
0.316566
31.65656
0.551466
35.908947
0.049094
4.909366
0.317953
9.060403
0.47399
19.415365
0.351812
27.979093
true
false
2024-08-21
2024-09-15
1
mistralai/Mistral-Nemo-Base-2407
ell44ot_gemma-2b-def_float16
float16
🟢 pretrained
🟢
Original
GemmaModel
<a target="_blank" href="https://huggingface.co/ell44ot/gemma-2b-def" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ell44ot/gemma-2b-def</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ell44ot__gemma-2b-def-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ell44ot/gemma-2b-def
f9f1f882322360354fbc7a71d44d9b0b9ddd87ee
8.009626
apache-2.0
0
1
true
false
false
false
0.447298
0.269304
26.930433
0.315865
4.58642
0.017372
1.73716
0.27349
3.131991
0.367021
5.310938
0.157247
6.360816
false
false
2024-11-28
2024-11-28
1
ell44ot/gemma-2b-def (Merge)
euclaise_ReMask-3B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
StableLmForCausalLM
<a target="_blank" href="https://huggingface.co/euclaise/ReMask-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">euclaise/ReMask-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/euclaise__ReMask-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
euclaise/ReMask-3B
e094dae96097c2bc6f758101ee269c089b65a2cf
7.25664
cc-by-sa-4.0
15
2
true
false
false
true
0.44684
0.241927
24.192698
0.351678
8.742083
0.017372
1.73716
0.266779
2.237136
0.334094
2.661719
0.135721
3.969046
false
false
2024-03-28
2024-08-10
0
euclaise/ReMask-3B
experiment-llm_exp-3-q-r_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/experiment-llm/exp-3-q-r" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">experiment-llm/exp-3-q-r</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/experiment-llm__exp-3-q-r-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
experiment-llm/exp-3-q-r
d4300d83f75f6d95fe44a18aa0099e37dcd7868a
28.786888
apache-2.0
0
7
true
false
false
true
0.733656
0.603579
60.357851
0.539716
33.994917
0.23565
23.564955
0.293624
5.816555
0.431542
12.142708
0.431599
36.844341
false
false
2024-12-02
2024-12-02
4
rombodawg/Rombos-LLM-V2.5-Qwen-7b (Merge)
facebook_opt-1.3b_float16
float16
🟢 pretrained
🟢
Original
OPTForCausalLM
<a target="_blank" href="https://huggingface.co/facebook/opt-1.3b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">facebook/opt-1.3b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/facebook__opt-1.3b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
facebook/opt-1.3b
3f5c25d0bc631cb57ac65913f76e22c2dfb61d62
5.251513
other
160
1
true
false
false
false
0.403005
0.23833
23.832985
0.309395
3.648052
0.007553
0.755287
0.24245
0
0.342
2.083333
0.110705
1.189421
false
true
2022-05-11
2024-06-12
0
facebook/opt-1.3b
facebook_opt-30b_float16
float16
🟢 pretrained
🟢
Original
OPTForCausalLM
<a target="_blank" href="https://huggingface.co/facebook/opt-30b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">facebook/opt-30b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/facebook__opt-30b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
facebook/opt-30b
ceea0a90ac0f6fae7c2c34bcb40477438c152546
6.201345
other
133
30
true
false
false
false
2.999845
0.245299
24.529914
0.307034
3.498429
0.006042
0.60423
0.269295
2.572707
0.360417
4.185417
0.116356
1.817376
false
true
2022-05-11
2024-06-12
0
facebook/opt-30b
failspy_Llama-3-8B-Instruct-MopeyMule_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/failspy/Llama-3-8B-Instruct-MopeyMule" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">failspy/Llama-3-8B-Instruct-MopeyMule</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/failspy__Llama-3-8B-Instruct-MopeyMule-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
failspy/Llama-3-8B-Instruct-MopeyMule
d1cbf407efe727c6b9fc94f22d51ff4915e1856e
15.612956
other
73
8
true
false
false
true
0.823136
0.675044
67.504444
0.383874
13.620496
0.018127
1.812689
0.239094
0
0.351302
2.246094
0.176446
8.494016
false
false
2024-05-30
2024-09-21
0
failspy/Llama-3-8B-Instruct-MopeyMule
failspy_Llama-3-8B-Instruct-abliterated_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/failspy/Llama-3-8B-Instruct-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">failspy/Llama-3-8B-Instruct-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/failspy__Llama-3-8B-Instruct-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
failspy/Llama-3-8B-Instruct-abliterated
dd67dd055661e4cbcedb0ed2431693d9cc3be6e0
19.177668
llama3
8
8
true
false
false
true
0.741906
0.590889
59.088884
0.435375
18.864599
0.037764
3.776435
0.276007
3.467562
0.411583
10.514583
0.274186
19.353945
false
false
2024-05-07
2024-07-03
0
failspy/Llama-3-8B-Instruct-abliterated
failspy_Meta-Llama-3-70B-Instruct-abliterated-v3.5_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/failspy/Meta-Llama-3-70B-Instruct-abliterated-v3.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">failspy/Meta-Llama-3-70B-Instruct-abliterated-v3.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/failspy__Meta-Llama-3-70B-Instruct-abliterated-v3.5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
failspy/Meta-Llama-3-70B-Instruct-abliterated-v3.5
fc951b03d92972ab52ad9392e620eba6173526b9
30.204883
llama3
39
70
true
false
false
true
9.204711
0.774687
77.468672
0.57471
37.871333
0.132931
13.293051
0.29698
6.263982
0.398187
7.973438
0.445229
38.358821
false
false
2024-05-28
2024-08-30
0
failspy/Meta-Llama-3-70B-Instruct-abliterated-v3.5
failspy_Phi-3-medium-4k-instruct-abliterated-v3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/failspy/Phi-3-medium-4k-instruct-abliterated-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">failspy/Phi-3-medium-4k-instruct-abliterated-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/failspy__Phi-3-medium-4k-instruct-abliterated-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
failspy/Phi-3-medium-4k-instruct-abliterated-v3
959b09eacf6cae85a8eb21b25e998addc89a367b
31.775592
mit
22
13
true
false
false
true
1.520981
0.63193
63.192995
0.63048
46.732839
0.154834
15.483384
0.317114
8.948546
0.460417
18.51875
0.439993
37.777039
false
false
2024-05-22
2024-07-29
0
failspy/Phi-3-medium-4k-instruct-abliterated-v3
failspy_llama-3-70B-Instruct-abliterated_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/failspy/llama-3-70B-Instruct-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">failspy/llama-3-70B-Instruct-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/failspy__llama-3-70B-Instruct-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
failspy/llama-3-70B-Instruct-abliterated
53ae9dafe8b3d163e05d75387575f8e9f43253d0
36.091429
llama3
95
70
true
false
false
true
9.374129
0.802339
80.233891
0.646485
48.939818
0.255287
25.528701
0.28943
5.257271
0.41276
10.528385
0.514545
46.060505
false
false
2024-05-07
2024-07-03
0
failspy/llama-3-70B-Instruct-abliterated
fblgit_TheBeagle-v2beta-32B-MGS_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/fblgit/TheBeagle-v2beta-32B-MGS" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fblgit/TheBeagle-v2beta-32B-MGS</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fblgit__TheBeagle-v2beta-32B-MGS-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fblgit/TheBeagle-v2beta-32B-MGS
56830f63e4a40378b7721ae966637b4678cc8784
41.622408
other
9
32
true
false
false
false
32.879107
0.518074
51.807427
0.703263
58.027976
0.433535
43.353474
0.38255
17.673378
0.50075
24.260417
0.591506
54.611776
false
false
2024-10-20
2024-10-30
1
fblgit/TheBeagle-v2beta-32B-MGS (Merge)
fblgit_TheBeagle-v2beta-32B-MGS_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/fblgit/TheBeagle-v2beta-32B-MGS" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fblgit/TheBeagle-v2beta-32B-MGS</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fblgit__TheBeagle-v2beta-32B-MGS-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fblgit/TheBeagle-v2beta-32B-MGS
56830f63e4a40378b7721ae966637b4678cc8784
40.28667
other
9
32
true
false
false
false
11.366068
0.450305
45.030519
0.703542
58.06603
0.39426
39.425982
0.401007
20.134228
0.502115
24.497656
0.59109
54.565603
false
false
2024-10-20
2024-10-20
1
fblgit/TheBeagle-v2beta-32B-MGS (Merge)
fblgit_UNA-SimpleSmaug-34b-v1beta_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/fblgit/UNA-SimpleSmaug-34b-v1beta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fblgit/UNA-SimpleSmaug-34b-v1beta</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fblgit__UNA-SimpleSmaug-34b-v1beta-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fblgit/UNA-SimpleSmaug-34b-v1beta
4b62fccfc7e44c0a02c11a5279d98fafa6b922ba
23.121397
apache-2.0
20
34
true
false
false
true
3.164466
0.455626
45.562552
0.528665
32.775789
0.001511
0.151057
0.317114
8.948546
0.425563
11.961979
0.453956
39.328457
false
false
2024-02-05
2024-06-30
2
jondurbin/bagel-34b-v0.2
fblgit_UNA-TheBeagle-7b-v1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/fblgit/UNA-TheBeagle-7b-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fblgit/UNA-TheBeagle-7b-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fblgit__UNA-TheBeagle-7b-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fblgit/UNA-TheBeagle-7b-v1
866d3ee19f983728e21a624f8a27574960073f27
19.633583
cc-by-nc-nd-4.0
36
7
true
false
false
false
0.560639
0.368872
36.887237
0.502869
30.173397
0.076284
7.628399
0.284396
4.58613
0.456438
16.088021
0.301945
22.438313
false
false
2024-01-09
2024-06-30
0
fblgit/UNA-TheBeagle-7b-v1
fblgit_UNA-ThePitbull-21.4B-v2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/fblgit/UNA-ThePitbull-21.4B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fblgit/UNA-ThePitbull-21.4B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fblgit__UNA-ThePitbull-21.4B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fblgit/UNA-ThePitbull-21.4B-v2
f12aac93ae9c852550a16816e16116c4f8e7dec0
22.799983
afl-3.0
15
21
true
false
false
true
2.298414
0.379039
37.903873
0.635039
46.788074
0.108006
10.800604
0.302013
6.935123
0.392167
6.420833
0.351563
27.951389
false
false
2024-05-28
2024-06-30
0
fblgit/UNA-ThePitbull-21.4B-v2
fblgit_cybertron-v4-qw7B-MGS_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/fblgit/cybertron-v4-qw7B-MGS" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fblgit/cybertron-v4-qw7B-MGS</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fblgit__cybertron-v4-qw7B-MGS-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fblgit/cybertron-v4-qw7B-MGS
ea2aaf4f4000190235722a9ad4f5cd9e9091a64e
31.207648
other
11
7
true
false
false
false
1.246739
0.626385
62.638466
0.559177
37.041623
0.27719
27.719033
0.310403
8.053691
0.437094
13.203385
0.447307
38.589687
false
false
2024-10-29
2024-10-29
1
fblgit/cybertron-v4-qw7B-MGS (Merge)
fblgit_cybertron-v4-qw7B-UNAMGS_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/fblgit/cybertron-v4-qw7B-UNAMGS" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fblgit/cybertron-v4-qw7B-UNAMGS</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fblgit__cybertron-v4-qw7B-UNAMGS-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fblgit/cybertron-v4-qw7B-UNAMGS
ce9b1e991908f5b89f63a2e3212cf9a066906ed2
31.815867
other
5
7
true
false
false
false
1.332592
0.608425
60.842454
0.564251
37.707173
0.299094
29.909366
0.331376
10.850112
0.434333
12.691667
0.45005
38.89443
false
false
2024-11-18
2024-11-18
1
fblgit/cybertron-v4-qw7B-UNAMGS (Merge)
fblgit_juanako-7b-UNA_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/fblgit/juanako-7b-UNA" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fblgit/juanako-7b-UNA</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fblgit__juanako-7b-UNA-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fblgit/juanako-7b-UNA
b8ac85b603d5ee1ac619b2e1d0b3bb86c4eecb0c
20.825304
apache-2.0
23
7
true
false
false
false
0.63179
0.483728
48.372762
0.507001
30.415072
0.031722
3.172205
0.296141
6.152125
0.4645
17.1625
0.277094
19.677157
false
false
2023-11-27
2024-06-30
0
fblgit/juanako-7b-UNA
fblgit_miniclaus-qw1.5B-UNAMGS_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/fblgit/miniclaus-qw1.5B-UNAMGS" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fblgit/miniclaus-qw1.5B-UNAMGS</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fblgit__miniclaus-qw1.5B-UNAMGS-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fblgit/miniclaus-qw1.5B-UNAMGS
de590536ba82ffb7b4001dffb5f8b60d2087c319
16.868868
other
8
1
true
false
false
false
0.591743
0.334801
33.480055
0.423859
18.562864
0.098187
9.818731
0.291946
5.592841
0.429344
12.234635
0.293717
21.524084
false
false
2024-11-01
2024-11-01
2
Qwen/Qwen2.5-1.5B
fblgit_pancho-v1-qw25-3B-UNAMGS_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/fblgit/pancho-v1-qw25-3B-UNAMGS" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fblgit/pancho-v1-qw25-3B-UNAMGS</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fblgit__pancho-v1-qw25-3B-UNAMGS-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fblgit/pancho-v1-qw25-3B-UNAMGS
01143501cbc2c90961be5397c6945c6789815a60
23.646637
other
1
3
true
false
false
false
1.566729
0.536134
53.613412
0.492583
28.66965
0.14426
14.425982
0.29698
6.263982
0.40274
8.175781
0.376579
30.731014
false
false
2024-11-04
2024-11-12
2
Qwen/Qwen2.5-3B
fblgit_una-cybertron-7b-v2-bf16_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/fblgit/una-cybertron-7b-v2-bf16" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fblgit/una-cybertron-7b-v2-bf16</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fblgit__una-cybertron-7b-v2-bf16-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fblgit/una-cybertron-7b-v2-bf16
7ab101a153740aec39e95ec02831c56f4eab7910
17.17956
apache-2.0
116
7
true
false
false
true
0.634206
0.473711
47.371086
0.397339
14.966965
0.03852
3.851964
0.297819
6.375839
0.447323
14.482031
0.244265
16.029477
false
false
2023-12-02
2024-06-30
0
fblgit/una-cybertron-7b-v2-bf16
flammenai_Llama3.1-Flammades-70B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/flammenai/Llama3.1-Flammades-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">flammenai/Llama3.1-Flammades-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/flammenai__Llama3.1-Flammades-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
flammenai/Llama3.1-Flammades-70B
48909a734460e667e3a7e91bd25f124ec3b2ba74
35.898954
llama3.1
2
70
true
false
false
true
10.284833
0.705844
70.584383
0.665972
52.547943
0.143505
14.350453
0.354027
13.870246
0.487052
22.348177
0.475233
41.692524
false
false
2024-10-12
2024-10-13
1
flammenai/Llama3.1-Flammades-70B (Merge)
flammenai_Mahou-1.2a-llama3-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/flammenai/Mahou-1.2a-llama3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">flammenai/Mahou-1.2a-llama3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/flammenai__Mahou-1.2a-llama3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
flammenai/Mahou-1.2a-llama3-8B
3318b6f5f1839644bee287a3e5390f3e9f565a9e
21.841614
llama3
6
8
true
false
false
false
0.932412
0.509257
50.925655
0.509366
28.972588
0.086858
8.685801
0.288591
5.145414
0.384667
6.016667
0.381732
31.303561
false
false
2024-05-25
2024-09-03
1
flammenai/Mahou-1.2a-llama3-8B (Merge)
flammenai_Mahou-1.2a-mistral-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/flammenai/Mahou-1.2a-mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">flammenai/Mahou-1.2a-mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/flammenai__Mahou-1.2a-mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
flammenai/Mahou-1.2a-mistral-7B
d45f61cca04da0c3359573102853fca1a0d3b252
19.503462
apache-2.0
6
7
true
false
false
false
1.805622
0.455201
45.520109
0.511811
31.16675
0.064199
6.41994
0.271812
2.908277
0.389625
6.969792
0.316323
24.035904
false
false
2024-05-18
2024-09-03
1
flammenai/Mahou-1.2a-mistral-7B (Merge)
flammenai_Mahou-1.5-llama3.1-70B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/flammenai/Mahou-1.5-llama3.1-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">flammenai/Mahou-1.5-llama3.1-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/flammenai__Mahou-1.5-llama3.1-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
flammenai/Mahou-1.5-llama3.1-70B
49f45cc4c21e2ba7ed5c5e71f90ffd0bd9169e2d
36.237159
llama3.1
7
70
true
false
false
true
10.259992
0.714662
71.466154
0.665086
52.369577
0.143505
14.350453
0.354027
13.870246
0.495021
23.710938
0.4749
41.655585
false
false
2024-10-14
2024-10-14
1
flammenai/Mahou-1.5-llama3.1-70B (Merge)
flammenai_Mahou-1.5-mistral-nemo-12B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/flammenai/Mahou-1.5-mistral-nemo-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">flammenai/Mahou-1.5-mistral-nemo-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/flammenai__Mahou-1.5-mistral-nemo-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
flammenai/Mahou-1.5-mistral-nemo-12B
852561e74f1785bf7225bb28395db1fd9431fe31
26.381801
apache-2.0
18
12
true
false
false
true
1.482632
0.675144
67.514417
0.552236
36.26051
0.056647
5.664653
0.276007
3.467562
0.452042
16.471875
0.360206
28.911791
false
false
2024-10-06
2024-10-07
1
flammenai/Mahou-1.5-mistral-nemo-12B (Merge)
flammenai_flammen15-gutenberg-DPO-v1-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/flammenai/flammen15-gutenberg-DPO-v1-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">flammenai/flammen15-gutenberg-DPO-v1-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/flammenai__flammen15-gutenberg-DPO-v1-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
flammenai/flammen15-gutenberg-DPO-v1-7B
550cd9548cba1265cb1771c85ebe498789fdecb5
21.574934
apache-2.0
2
7
true
false
false
false
0.62753
0.479806
47.98058
0.520298
32.665113
0.074018
7.401813
0.284396
4.58613
0.429313
12.530729
0.318567
24.285239
false
false
2024-04-05
2024-07-10
1
flammenai/flammen15-gutenberg-DPO-v1-7B (Merge)
fluently-lm_Llama-TI-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/fluently-lm/Llama-TI-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fluently-lm/Llama-TI-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fluently-lm__Llama-TI-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fluently-lm/Llama-TI-8B
2ab7eb6daca1c850cc65cec04f4d374b1041d824
21.074708
apache-2.0
1
8
true
false
false
false
0.680392
0.288039
28.803907
0.520086
31.984333
0.19713
19.712991
0.296141
6.152125
0.410271
12.683854
0.343999
27.111037
false
false
2024-12-07
2024-12-07
1
meta-llama/Llama-3.1-8B
freewheelin_free-evo-qwen72b-v0.8-re_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/freewheelin/free-evo-qwen72b-v0.8-re" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">freewheelin/free-evo-qwen72b-v0.8-re</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/freewheelin__free-evo-qwen72b-v0.8-re-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
freewheelin/free-evo-qwen72b-v0.8-re
24e301d8fbef8ada12be42156b01c827ff594962
32.424578
mit
4
72
true
false
false
false
11.789791
0.533087
53.308665
0.612748
45.317403
0.177492
17.749245
0.356544
14.205817
0.487167
20.9625
0.487035
43.003842
false
false
2024-05-02
2024-09-15
0
freewheelin/free-evo-qwen72b-v0.8-re
freewheelin_free-solar-evo-v0.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/freewheelin/free-solar-evo-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">freewheelin/free-solar-evo-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/freewheelin__free-solar-evo-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
freewheelin/free-solar-evo-v0.1
233efd607ae0abbd7b46eded2ee7889892b7bdbb
16.295571
mit
1
10
true
false
false
true
0.801111
0.205007
20.500716
0.450221
22.635183
0.000755
0.075529
0.291107
5.480984
0.494583
22.25625
0.341423
26.824764
false
false
2024-04-18
2024-08-07
0
freewheelin/free-solar-evo-v0.1
freewheelin_free-solar-evo-v0.11_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/freewheelin/free-solar-evo-v0.11" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">freewheelin/free-solar-evo-v0.11</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/freewheelin__free-solar-evo-v0.11-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
freewheelin/free-solar-evo-v0.11
17fc24a557bd3c3836abc9f6a367c803cba0cccd
16.641294
mit
0
10
true
false
false
true
0.813502
0.202659
20.265894
0.454516
23.182425
0
0
0.285235
4.697987
0.505219
24.285677
0.346742
27.41578
false
false
2024-04-24
2024-08-07
0
freewheelin/free-solar-evo-v0.11
freewheelin_free-solar-evo-v0.13_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/freewheelin/free-solar-evo-v0.13" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">freewheelin/free-solar-evo-v0.13</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/freewheelin__free-solar-evo-v0.13-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
freewheelin/free-solar-evo-v0.13
2a7eb72f84c54898630f9db470eee0f936a64396
17.204491
mit
1
10
true
false
false
true
0.815956
0.23206
23.205982
0.455484
23.354204
0
0
0.288591
5.145414
0.505156
24.077865
0.346991
27.443484
false
false
2024-04-28
2024-08-07
0
freewheelin/free-solar-evo-v0.13
gabrielmbmb_SmolLM-1.7B-Instruct-IFEval_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/gabrielmbmb/SmolLM-1.7B-Instruct-IFEval" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gabrielmbmb/SmolLM-1.7B-Instruct-IFEval</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gabrielmbmb__SmolLM-1.7B-Instruct-IFEval-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gabrielmbmb/SmolLM-1.7B-Instruct-IFEval
ac5d711adc05ccfe1b1b912d5561d98f6afeeb40
5.222836
0
1
false
false
false
true
0.134745
0.230586
23.058596
0.313843
4.501675
0
0
0.253356
0.447427
0.33276
1.595052
0.115608
1.734264
false
false
2024-10-01
2024-10-11
2
HuggingFaceTB/SmolLM-1.7B
gaverfraxz_Meta-Llama-3.1-8B-Instruct-HalfAbliterated-DELLA_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/gaverfraxz/Meta-Llama-3.1-8B-Instruct-HalfAbliterated-DELLA" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gaverfraxz/Meta-Llama-3.1-8B-Instruct-HalfAbliterated-DELLA</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gaverfraxz__Meta-Llama-3.1-8B-Instruct-HalfAbliterated-DELLA-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gaverfraxz/Meta-Llama-3.1-8B-Instruct-HalfAbliterated-DELLA
6b0271a98b8875a65972ed54b0d636d8236ea60b
11.919582
llama3.1
0
8
true
false
false
false
1.345674
0.400946
40.094616
0.398484
15.276579
0.008308
0.830816
0.284396
4.58613
0.365042
3.463542
0.165392
7.26581
true
false
2024-09-22
2024-09-23
1
gaverfraxz/Meta-Llama-3.1-8B-Instruct-HalfAbliterated-DELLA (Merge)
gaverfraxz_Meta-Llama-3.1-8B-Instruct-HalfAbliterated-TIES_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/gaverfraxz/Meta-Llama-3.1-8B-Instruct-HalfAbliterated-TIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gaverfraxz/Meta-Llama-3.1-8B-Instruct-HalfAbliterated-TIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gaverfraxz__Meta-Llama-3.1-8B-Instruct-HalfAbliterated-TIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gaverfraxz/Meta-Llama-3.1-8B-Instruct-HalfAbliterated-TIES
80569e49b5aba960a5cd91281dd9eef92aeff9a3
20.986454
llama3.1
1
8
true
false
false
true
0.961357
0.455051
45.505149
0.504366
28.914235
0.129154
12.915408
0.266779
2.237136
0.37375
6.585417
0.367852
29.761377
true
false
2024-09-19
2024-09-19
1
gaverfraxz/Meta-Llama-3.1-8B-Instruct-HalfAbliterated-TIES (Merge)
gbueno86_Brinebreath-Llama-3.1-70B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/gbueno86/Brinebreath-Llama-3.1-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gbueno86/Brinebreath-Llama-3.1-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gbueno86__Brinebreath-Llama-3.1-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gbueno86/Brinebreath-Llama-3.1-70B
c508ecf356167e8c498c6fa3937ba30a82208983
36.292756
llama3.1
2
70
true
false
false
true
10.559754
0.553295
55.329526
0.688056
55.463618
0.299849
29.984894
0.346477
12.863535
0.454063
17.491146
0.519614
46.623818
true
false
2024-08-23
2024-08-29
1
gbueno86/Brinebreath-Llama-3.1-70B (Merge)
gbueno86_Meta-LLama-3-Cat-Smaug-LLama-70b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/gbueno86/Meta-LLama-3-Cat-Smaug-LLama-70b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gbueno86/Meta-LLama-3-Cat-Smaug-LLama-70b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gbueno86__Meta-LLama-3-Cat-Smaug-LLama-70b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gbueno86/Meta-LLama-3-Cat-Smaug-LLama-70b
2d73b7e1c7157df482555944d6a6b1362bc6c3c5
38.268137
llama3
1
70
true
false
false
true
10.902293
0.807185
80.718494
0.667431
51.508386
0.268127
26.812689
0.327181
10.290828
0.436823
15.002865
0.50748
45.275561
true
false
2024-05-24
2024-06-27
1
gbueno86/Meta-LLama-3-Cat-Smaug-LLama-70b (Merge)
ghost-x_ghost-8b-beta-1608_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ghost-x/ghost-8b-beta-1608" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ghost-x/ghost-8b-beta-1608</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ghost-x__ghost-8b-beta-1608-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ghost-x/ghost-8b-beta-1608
6d1b3853aab774af5a4db21ff9d5764918fb48f5
15.103135
other
29
8
true
false
false
true
0.848931
0.427274
42.727408
0.451655
23.463964
0.01284
1.283988
0.258389
1.118568
0.351583
1.58125
0.283993
20.443632
false
false
2024-08-18
2024-09-17
1
ghost-x/ghost-8b-beta
glaiveai_Reflection-Llama-3.1-70B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/glaiveai/Reflection-Llama-3.1-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">glaiveai/Reflection-Llama-3.1-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/glaiveai__Reflection-Llama-3.1-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
glaiveai/Reflection-Llama-3.1-70B
086bd2658e00345808b31758ebb8f7e2c6d9897c
29.924816
10
69
true
false
false
true
25.243776
0.599057
59.905717
0.568101
37.960486
0
0
0.314597
8.612975
0.438031
13.720573
0.634142
59.349143
false
false
2024-09-19
2024-10-07
0
glaiveai/Reflection-Llama-3.1-70B
gmonsoon_SahabatAI-Llama-11B-Test_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/gmonsoon/SahabatAI-Llama-11B-Test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gmonsoon/SahabatAI-Llama-11B-Test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gmonsoon__SahabatAI-Llama-11B-Test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gmonsoon/SahabatAI-Llama-11B-Test
f6340b95d6e6cf766b6de29d36ee0db373ef175b
16.164914
llama3
0
11
true
false
false
false
1.049972
0.337573
33.757319
0.472758
24.457264
0.024924
2.492447
0.281879
4.250559
0.400135
7.783594
0.318235
24.248301
false
false
2024-11-22
2024-11-23
1
gmonsoon/SahabatAI-Llama-11B-Test (Merge)
gmonsoon_SahabatAI-MediChatIndo-8B-v1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/gmonsoon/SahabatAI-MediChatIndo-8B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gmonsoon/SahabatAI-MediChatIndo-8B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gmonsoon__SahabatAI-MediChatIndo-8B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gmonsoon/SahabatAI-MediChatIndo-8B-v1
2f7daa8eb5ad216ce9ebcd70dc77e5b44fb977b0
17.299865
llama3
0
8
true
false
false
true
0.676722
0.416283
41.628324
0.450883
23.6401
0.061934
6.193353
0.282718
4.362416
0.375396
4.557813
0.310755
23.417184
true
false
2024-11-19
2024-11-19
1
gmonsoon/SahabatAI-MediChatIndo-8B-v1 (Merge)
gmonsoon_SahabatAI-Rebase-8B-Test_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/gmonsoon/SahabatAI-Rebase-8B-Test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gmonsoon/SahabatAI-Rebase-8B-Test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gmonsoon__SahabatAI-Rebase-8B-Test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gmonsoon/SahabatAI-Rebase-8B-Test
aef1b4c94595f3ef110d3d69724828a2fb416b5d
23.516791
0
8
false
false
false
true
0.652155
0.515626
51.562632
0.522961
32.002221
0.114804
11.480363
0.287752
5.033557
0.413281
11.426823
0.366356
29.595154
false
false
2024-11-21
2024-11-23
1
gmonsoon/SahabatAI-Rebase-8B-Test (Merge)