Schema (column: dtype, observed range)
eval_name: stringlengths (12 to 111)
Precision: stringclasses (3 values)
Type: stringclasses (6 values)
T: stringclasses (6 values)
Weight type: stringclasses (2 values)
Architecture: stringclasses (52 values)
Model: stringlengths (355 to 689)
fullname: stringlengths (4 to 102)
Model sha: stringlengths (0 to 40)
Average ⬆️: float64 (1.03 to 52)
Hub License: stringclasses (26 values)
Hub ❤️: int64 (0 to 5.9k)
#Params (B): int64 (-1 to 140)
Available on the hub: bool (2 classes)
MoE: bool (2 classes)
Flagged: bool (2 classes)
Chat Template: bool (2 classes)
CO₂ cost (kg): float64 (0.03 to 107)
IFEval Raw: float64 (0 to 0.9)
IFEval: float64 (0 to 90)
BBH Raw: float64 (0.27 to 0.75)
BBH: float64 (0.81 to 63.5)
MATH Lvl 5 Raw: float64 (0 to 0.51)
MATH Lvl 5: float64 (0 to 50.7)
GPQA Raw: float64 (0.22 to 0.44)
GPQA: float64 (0 to 24.9)
MUSR Raw: float64 (0.29 to 0.6)
MUSR: float64 (0 to 38.5)
MMLU-PRO Raw: float64 (0.1 to 0.73)
MMLU-PRO: float64 (0 to 70)
Merged: bool (2 classes)
Official Providers: bool (2 classes)
Upload To Hub Date: stringclasses (424 values)
Submission Date: stringclasses (169 values)
Generation: int64 (0 to 10)
Base Model: stringlengths (4 to 102)
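The columns above are the Open LLM Leaderboard fields: per-benchmark raw and normalized scores plus model metadata. Below is a minimal sketch of loading and slicing this table with the datasets library; the repo id "open-llm-leaderboard/contents" is an assumption about where this dump was taken from, so substitute the actual dataset id if it differs.

# Minimal sketch (assumed repo id; requires `datasets` and `pandas`).
from datasets import load_dataset

ds = load_dataset("open-llm-leaderboard/contents", split="train")
print(ds.column_names)  # should match the schema listed above

# Keep the headline columns and rank by the leaderboard Average.
df = ds.to_pandas()
cols = ["fullname", "Precision", "#Params (B)", "Average ⬆️",
        "IFEval", "BBH", "MATH Lvl 5", "GPQA", "MUSR", "MMLU-PRO"]
print(df[cols].sort_values("Average ⬆️", ascending=False).head(10))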
icefog72_IceCoffeeRP-7b_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
icefog72/IceCoffeeRP-7b (📑 open-llm-leaderboard/icefog72__IceCoffeeRP-7b-details)
icefog72/IceCoffeeRP-7b
131c0f7c0809a9d23b05b63cb550a586c3c7b372
20.343825
cc-by-nc-4.0
6
7
true
false
false
false
0.571745
0.495917
49.59175
0.488872
29.398107
0.054381
5.438066
0.285235
4.697987
0.415979
10.997396
0.297457
21.939642
true
false
2024-04-26
2024-06-26
0
icefog72/IceCoffeeRP-7b
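Each record in this dump, like the one above, lists its values one per line in the schema order given at the top (records with a missing cell, e.g. an absent Hub License, simply have fewer lines). A minimal sketch of rebuilding a complete record into a dict, assuming its values have been collected in order into a list of strings:

# Minimal sketch: pair the schema order with one record's flattened values.
COLUMNS = [
    "eval_name", "Precision", "Type", "T", "Weight type", "Architecture",
    "Model", "fullname", "Model sha", "Average ⬆️", "Hub License", "Hub ❤️",
    "#Params (B)", "Available on the hub", "MoE", "Flagged", "Chat Template",
    "CO₂ cost (kg)", "IFEval Raw", "IFEval", "BBH Raw", "BBH",
    "MATH Lvl 5 Raw", "MATH Lvl 5", "GPQA Raw", "GPQA", "MUSR Raw", "MUSR",
    "MMLU-PRO Raw", "MMLU-PRO", "Merged", "Official Providers",
    "Upload To Hub Date", "Submission Date", "Generation", "Base Model",
]

def parse_record(values):
    """Map one flattened record (list of strings in schema order) to a dict."""
    if len(values) != len(COLUMNS):
        raise ValueError(f"expected {len(COLUMNS)} fields, got {len(values)}")
    return dict(zip(COLUMNS, values))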
icefog72_IceDrinkByFrankensteinV3RP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
icefog72/IceDrinkByFrankensteinV3RP (📑 open-llm-leaderboard/icefog72__IceDrinkByFrankensteinV3RP-details)
icefog72/IceDrinkByFrankensteinV3RP
a4d2eb422867ea28860ad3b983b93bc97ca91719
19.830522
cc-by-nc-4.0
0
7
true
false
false
false
1.755961
0.497491
49.74911
0.483252
28.845881
0.052115
5.21148
0.261745
1.565996
0.425313
12.197396
0.292719
21.413268
true
false
2024-09-23
2024-10-03
0
icefog72/IceDrinkByFrankensteinV3RP
icefog72_IceDrinkNameGoesHereRP-7b-Model_Stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
icefog72/IceDrinkNameGoesHereRP-7b-Model_Stock (📑 open-llm-leaderboard/icefog72__IceDrinkNameGoesHereRP-7b-Model_Stock-details)
icefog72/IceDrinkNameGoesHereRP-7b-Model_Stock
78f7625f85c3cb150565ebb68c3f8d47d48325c8
18.619303
cc-by-nc-4.0
2
7
true
false
false
false
1.693099
0.496842
49.684171
0.465786
26.224654
0.03852
3.851964
0.268456
2.46085
0.40674
9.309115
0.281666
20.185062
true
false
2024-09-14
2024-09-24
0
icefog72/IceDrinkNameGoesHereRP-7b-Model_Stock
icefog72_IceDrinkNameNotFoundRP-7b-Model_Stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
icefog72/IceDrinkNameNotFoundRP-7b-Model_Stock (📑 open-llm-leaderboard/icefog72__IceDrinkNameNotFoundRP-7b-Model_Stock-details)
icefog72/IceDrinkNameNotFoundRP-7b-Model_Stock
35db2bf9e6812c5819378be68f94159e962fd1cb
21.39385
cc-by-nc-4.0
1
7
true
false
false
false
0.601449
0.513003
51.300328
0.502625
30.668251
0.061178
6.117825
0.277685
3.691275
0.437188
13.648438
0.306433
22.936983
true
false
2024-09-15
2024-09-15
0
icefog72/IceDrinkNameNotFoundRP-7b-Model_Stock
icefog72_IceDrunkCherryRP-7b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
icefog72/IceDrunkCherryRP-7b (📑 open-llm-leaderboard/icefog72__IceDrunkCherryRP-7b-details)
icefog72/IceDrunkCherryRP-7b
160b01e50d9c9441886f6cf987a3495bd8fa1c49
20.284231
cc-by-nc-4.0
1
7
true
false
false
false
0.54493
0.489823
48.982256
0.484663
28.24109
0.061934
6.193353
0.276846
3.579418
0.429188
12.381771
0.300947
22.327497
true
false
2024-09-24
2024-09-24
0
icefog72/IceDrunkCherryRP-7b
icefog72_IceDrunkenCherryRP-7b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
icefog72/IceDrunkenCherryRP-7b (📑 open-llm-leaderboard/icefog72__IceDrunkenCherryRP-7b-details)
icefog72/IceDrunkenCherryRP-7b
7a0d428a84bbef60a5287e838551dc56230b291f
21.76755
cc-by-nc-4.0
6
7
true
false
false
false
0.416585
0.476259
47.625855
0.509309
31.50524
0.062689
6.268882
0.307047
7.606264
0.444594
14.274219
0.309924
23.324837
true
false
2024-11-21
2024-11-25
1
icefog72/IceDrunkenCherryRP-7b (Merge)
icefog72_IceEspressoRPv2-7b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
icefog72/IceEspressoRPv2-7b (📑 open-llm-leaderboard/icefog72__IceEspressoRPv2-7b-details)
icefog72/IceEspressoRPv2-7b
d71a4c2ae25c063fd4c3d3df039908c648a8bab4
21.3401
1
7
false
false
false
false
0.572534
0.497716
49.771606
0.505489
31.303239
0.060423
6.042296
0.28943
5.257271
0.433062
12.766146
0.3061
22.900044
false
false
2024-09-11
2024-09-11
1
icefog72/IceEspressoRPv2-7b (Merge)
icefog72_IceLemonTeaRP-32k-7b_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
icefog72/IceLemonTeaRP-32k-7b (📑 open-llm-leaderboard/icefog72__IceLemonTeaRP-32k-7b-details)
icefog72/IceLemonTeaRP-32k-7b
7ea0bdf873c535b73ca20db46db0799bac433662
21.359848
cc-by-nc-4.0
24
7
true
false
false
false
0.571435
0.521221
52.122147
0.499739
30.13578
0.053625
5.362538
0.290268
5.369128
0.429031
12.195573
0.306765
22.973921
true
false
2024-04-03
2024-07-27
1
icefog72/IceLemonTeaRP-32k-7b (Merge)
icefog72_IceMartiniRP-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
icefog72/IceMartiniRP-7b (📑 open-llm-leaderboard/icefog72__IceMartiniRP-7b-details)
icefog72/IceMartiniRP-7b
e5be38a55d2d9877fbb61cffc7f48402ac0193fc
21.183767
cc-by-nc-4.0
2
7
true
false
false
false
0.569438
0.50446
50.446039
0.497242
29.685368
0.068731
6.873112
0.279362
3.914989
0.43449
13.144531
0.307347
23.038564
false
false
2024-09-24
2024-09-24
0
icefog72/IceMartiniRP-7b
icefog72_IceSakeRP-7b_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
icefog72/IceSakeRP-7b (📑 open-llm-leaderboard/icefog72__IceSakeRP-7b-details)
icefog72/IceSakeRP-7b
3b6b00bc48cd99e9b28e5aa8293dc987a0cf069a
21.576224
cc-by-nc-4.0
15
7
true
false
false
false
1.292449
0.522795
52.279507
0.511929
31.651255
0.064199
6.41994
0.285235
4.697987
0.413
10.225
0.317653
24.183658
true
false
2024-07-07
2024-08-22
1
icefog72/IceSakeRP-7b (Merge)
icefog72_IceSakeV4RP-7b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
icefog72/IceSakeV4RP-7b (📑 open-llm-leaderboard/icefog72__IceSakeV4RP-7b-details)
icefog72/IceSakeV4RP-7b
e8cb50b78918149c7d1bf663bcb807e7bfac3eed
20.052251
0
7
false
false
false
false
0.548079
0.463419
46.341928
0.492956
29.234193
0.055891
5.589124
0.294463
5.928412
0.408198
9.858073
0.310256
23.361776
false
false
2024-06-26
0
Removed
icefog72_IceSakeV6RP-7b_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
icefog72/IceSakeV6RP-7b (📑 open-llm-leaderboard/icefog72__IceSakeV6RP-7b-details)
icefog72/IceSakeV6RP-7b
6838e68d35d037b0ef9b04a9de1ebc8ab508cd45
21.227054
cc-by-nc-4.0
1
7
true
false
false
false
0.555443
0.503261
50.326135
0.497603
30.391495
0.062689
6.268882
0.291107
5.480984
0.42001
11.634635
0.309342
23.260195
true
false
2024-06-26
2024-06-26
0
icefog72/IceSakeV6RP-7b
icefog72_IceSakeV8RP-7b_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
icefog72/IceSakeV8RP-7b (📑 open-llm-leaderboard/icefog72__IceSakeV8RP-7b-details)
icefog72/IceSakeV8RP-7b
0f8f73fe356583e561479c689aa6597435327f4e
21.765015
cc-by-nc-4.0
1
7
true
false
false
true
0.648285
0.608574
60.857414
0.488471
28.966258
0.064199
6.41994
0.276007
3.467562
0.399271
8.542188
0.301031
22.336732
true
false
2024-06-26
2024-06-26
0
icefog72/IceSakeV8RP-7b
icefog72_IceTea21EnergyDrinkRPV13-DPOv3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
icefog72/IceTea21EnergyDrinkRPV13-DPOv3 (📑 open-llm-leaderboard/icefog72__IceTea21EnergyDrinkRPV13-DPOv3-details)
icefog72/IceTea21EnergyDrinkRPV13-DPOv3
2d4b4fd596ff0f6706a5752198e59da6ffc08067
21.684259
2
7
false
false
false
false
0.579942
0.526342
52.634233
0.501959
30.612734
0.058912
5.891239
0.283557
4.474273
0.437188
13.648438
0.305602
22.844637
false
false
2024-09-05
2024-09-06
1
icefog72/IceTea21EnergyDrinkRPV13-DPOv3 (Merge)
icefog72_IceTea21EnergyDrinkRPV13-DPOv3.5_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
icefog72/IceTea21EnergyDrinkRPV13-DPOv3.5 (📑 open-llm-leaderboard/icefog72__IceTea21EnergyDrinkRPV13-DPOv3.5-details)
icefog72/IceTea21EnergyDrinkRPV13-DPOv3.5
0b0b0864347c3fad2b4d3e102f2f9839d20e296c
17.308799
0
7
false
false
false
false
0.500729
0.4871
48.709978
0.439966
22.573226
0.035498
3.549849
0.284396
4.58613
0.396417
7.785417
0.249834
16.648197
false
false
2024-09-25
0
Removed
ifable_gemma-2-Ifable-9B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
ifable/gemma-2-Ifable-9B (📑 open-llm-leaderboard/ifable__gemma-2-Ifable-9B-details)
ifable/gemma-2-Ifable-9B
d3dbde4efb93ea0a4f247de82541479de6b03160
22.888691
gemma
37
9
true
false
false
false
4.317604
0.298429
29.842928
0.586612
41.032645
0.098943
9.89426
0.341443
12.192394
0.40525
8.522917
0.422623
35.847001
false
false
2024-09-10
2024-09-25
0
ifable/gemma-2-Ifable-9B
informatiker_Qwen2-7B-Instruct-abliterated_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
informatiker/Qwen2-7B-Instruct-abliterated (📑 open-llm-leaderboard/informatiker__Qwen2-7B-Instruct-abliterated-details)
informatiker/Qwen2-7B-Instruct-abliterated
7577d60acfe4544d5ab303f0a4d69a9fcb9cf1aa
25.121604
5
7
false
false
false
true
1.060605
0.582171
58.217086
0.553427
37.795723
0.09139
9.138973
0.301174
6.823266
0.388792
6.832292
0.387301
31.922281
false
false
2024-07-10
2024-09-15
0
informatiker/Qwen2-7B-Instruct-abliterated
instruction-pretrain_InstructLM-500M_float16
float16
🟢 pretrained
🟢
Original
MistralForCausalLM
instruction-pretrain/InstructLM-500M (📑 open-llm-leaderboard/instruction-pretrain__InstructLM-500M-details)
instruction-pretrain/InstructLM-500M
e9d33823c76303dfaff6a8397a8b70d0118ea350
2.85435
apache-2.0
34
0
true
false
false
false
0.245792
0.102766
10.276622
0.294087
2.317054
0
0
0.256711
0.894855
0.352823
2.069531
0.114112
1.568041
false
false
2024-06-18
2024-06-27
0
instruction-pretrain/InstructLM-500M
internlm_internlm2-1_8b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
InternLM2ForCausalLM
internlm/internlm2-1_8b (📑 open-llm-leaderboard/internlm__internlm2-1_8b-details)
internlm/internlm2-1_8b
c24f301c7374ad9f9b58d1ea80f68b5f57cbca13
8.597072
other
28
8
true
false
false
false
0.663646
0.21977
21.977021
0.387973
13.633858
0.012085
1.208459
0.248322
0
0.381281
8.226823
0.158826
6.536274
false
true
2024-01-30
2024-06-12
0
internlm/internlm2-1_8b
internlm_internlm2-chat-1_8b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
InternLM2ForCausalLM
internlm/internlm2-chat-1_8b (📑 open-llm-leaderboard/internlm__internlm2-chat-1_8b-details)
internlm/internlm2-chat-1_8b
4e226eeb354499f4d34ef4c27f6939f377475cc1
10.553684
other
30
1
true
false
false
true
0.596423
0.238655
23.865455
0.445227
20.672357
0.02719
2.719033
0.26594
2.12528
0.363052
4.614844
0.183926
9.325133
false
true
2024-01-30
2024-06-12
0
internlm/internlm2-chat-1_8b
internlm_internlm2_5-1_8b-chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
InternLM2ForCausalLM
internlm/internlm2_5-1_8b-chat (📑 open-llm-leaderboard/internlm__internlm2_5-1_8b-chat-details)
internlm/internlm2_5-1_8b-chat
4426f00b854561fa60d555d2b628064b56bcb758
12.106338
other
24
1
true
false
false
true
0.771666
0.384909
38.490871
0.448893
21.030927
0
0
0.290268
5.369128
0.359396
4.424479
0.129904
3.322621
false
true
2024-07-30
2024-08-07
0
internlm/internlm2_5-1_8b-chat
internlm_internlm2_5-20b-chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
InternLM2ForCausalLM
internlm/internlm2_5-20b-chat (📑 open-llm-leaderboard/internlm__internlm2_5-20b-chat-details)
internlm/internlm2_5-20b-chat
ef17bde929761255fee76d95e2c25969ccd93b0d
32.082013
other
85
19
true
false
false
true
3.732708
0.700998
70.09978
0.747358
62.832459
0
0
0.321309
9.50783
0.455823
16.744531
0.399767
33.307476
false
true
2024-07-30
2024-08-12
0
internlm/internlm2_5-20b-chat
internlm_internlm2_5-7b-chat_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
InternLM2ForCausalLM
internlm/internlm2_5-7b-chat (📑 open-llm-leaderboard/internlm__internlm2_5-7b-chat-details)
internlm/internlm2_5-7b-chat
bebb00121ee105b823647c3ba2b1e152652edc33
30.576856
other
184
7
true
false
false
true
1.453382
0.61402
61.401969
0.710774
57.673648
0.089879
8.987915
0.329698
10.626398
0.4415
14.354167
0.373753
30.417036
false
true
2024-06-27
2024-07-03
0
internlm/internlm2_5-7b-chat
intervitens_mini-magnum-12b-v1.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
intervitens/mini-magnum-12b-v1.1 (📑 open-llm-leaderboard/intervitens__mini-magnum-12b-v1.1-details)
intervitens/mini-magnum-12b-v1.1
3b19e12711d3f4d9b81fdeb73860e9019ebe2404
20.638504
apache-2.0
72
12
true
false
false
true
2.230948
0.515551
51.555096
0.50618
29.731187
0.03852
3.851964
0.288591
5.145414
0.400448
8.089323
0.329122
25.458038
false
false
2024-07-24
2024-07-25
0
intervitens/mini-magnum-12b-v1.1
invalid-coder_Sakura-SOLAR-Instruct-CarbonVillain-en-10.7B-v2-slerp_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
invalid-coder/Sakura-SOLAR-Instruct-CarbonVillain-en-10.7B-v2-slerp (📑 open-llm-leaderboard/invalid-coder__Sakura-SOLAR-Instruct-CarbonVillain-en-10.7B-v2-slerp-details)
invalid-coder/Sakura-SOLAR-Instruct-CarbonVillain-en-10.7B-v2-slerp
39a1c76ddb5fa3a82c5b4071121d2e4866a25300
19.529851
apache-2.0
0
10
true
false
false
true
0.764251
0.455476
45.547592
0.515844
31.635375
0
0
0.305369
7.38255
0.39924
8.771615
0.314578
23.841977
true
false
2024-01-10
2024-07-25
0
invalid-coder/Sakura-SOLAR-Instruct-CarbonVillain-en-10.7B-v2-slerp
invisietch_EtherealRainbow-v0.2-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
invisietch/EtherealRainbow-v0.2-8B (📑 open-llm-leaderboard/invisietch__EtherealRainbow-v0.2-8B-details)
invisietch/EtherealRainbow-v0.2-8B
46611fbb6aac0f33478c8401488d3ec7763c04d0
20.156929
llama3
6
8
true
false
false
false
0.871417
0.39033
39.032988
0.510204
30.283791
0.085347
8.534743
0.302852
7.04698
0.382677
6.567969
0.365276
29.475103
true
false
2024-06-12
2024-07-01
0
invisietch/EtherealRainbow-v0.2-8B
invisietch_EtherealRainbow-v0.3-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
invisietch/EtherealRainbow-v0.3-8B (📑 open-llm-leaderboard/invisietch__EtherealRainbow-v0.3-8B-details)
invisietch/EtherealRainbow-v0.3-8B
c986c4ca5a5b8474820a59d3e911a431cf26938d
19.778644
llama3
13
8
true
false
false
false
1.281267
0.368223
36.822298
0.509676
30.080258
0.075529
7.55287
0.30453
7.270694
0.390396
7.766146
0.362616
29.179595
true
false
2024-06-19
2024-07-01
0
invisietch/EtherealRainbow-v0.3-8B
invisietch_MiS-Firefly-v0.2-22B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
invisietch/MiS-Firefly-v0.2-22B (📑 open-llm-leaderboard/invisietch__MiS-Firefly-v0.2-22B-details)
invisietch/MiS-Firefly-v0.2-22B
02dd13deefc5ff516edb59070ad66bd9f2831f4c
26.653481
other
6
22
true
false
false
true
1.025278
0.537108
53.710821
0.551352
36.082656
0.159366
15.936556
0.30453
7.270694
0.469375
17.805208
0.362035
29.114953
false
false
2024-11-06
2024-11-07
0
invisietch/MiS-Firefly-v0.2-22B
invisietch_Nimbus-Miqu-v0.1-70B_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
invisietch/Nimbus-Miqu-v0.1-70B (📑 open-llm-leaderboard/invisietch__Nimbus-Miqu-v0.1-70B-details)
invisietch/Nimbus-Miqu-v0.1-70B
3209583a0849383daf8faa7b819f29726b8806cf
24.782876
unknown
19
68
true
false
false
false
7.143677
0.464668
46.466819
0.601031
43.450995
0.058912
5.891239
0.338926
11.856823
0.413312
9.330729
0.385306
31.70065
true
false
2024-06-30
2024-07-03
0
invisietch/Nimbus-Miqu-v0.1-70B
jaredjoss_pythia-410m-roberta-lr_8e7-kl_01-steps_12000-rlhf-model_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GPTNeoXForCausalLM
jaredjoss/pythia-410m-roberta-lr_8e7-kl_01-steps_12000-rlhf-model (📑 open-llm-leaderboard/jaredjoss__pythia-410m-roberta-lr_8e7-kl_01-steps_12000-rlhf-model-details)
jaredjoss/pythia-410m-roberta-lr_8e7-kl_01-steps_12000-rlhf-model
048bc8edfc32fdcf6d957332d5f4c0d4e5950746
3.81661
mit
0
0
true
false
false
true
0.233064
0.157222
15.722173
0.286344
1.820374
0
0
0.259228
1.230425
0.360698
2.253906
0.116855
1.872784
false
false
2024-04-23
2024-08-06
0
jaredjoss/pythia-410m-roberta-lr_8e7-kl_01-steps_12000-rlhf-model
jebcarter_psyonic-cetacean-20B_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
jebcarter/psyonic-cetacean-20B (📑 open-llm-leaderboard/jebcarter__psyonic-cetacean-20B-details)
jebcarter/psyonic-cetacean-20B
298d2086a949d53af06096d229f64f4719261698
15.898966
other
37
19
true
false
false
false
2.144948
0.254366
25.436619
0.490739
27.84306
0.011329
1.132931
0.27349
3.131991
0.466115
16.897656
0.288564
20.951537
false
false
2023-11-28
2024-06-30
0
jebcarter/psyonic-cetacean-20B
jeffmeloy_Qwen-7B-nerd-uncensored-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
jeffmeloy/Qwen-7B-nerd-uncensored-v1.0 (📑 open-llm-leaderboard/jeffmeloy__Qwen-7B-nerd-uncensored-v1.0-details)
jeffmeloy/Qwen-7B-nerd-uncensored-v1.0
245a9a038ea9cfdc214a5e24a2e7ff9362f56b4a
31.178621
apache-2.0
3
7
true
false
false
false
0.726911
0.615119
61.5119
0.542108
34.18632
0.246979
24.697885
0.32802
10.402685
0.479292
18.911458
0.436253
37.36148
false
false
2024-10-30
2024-10-30
1
jeffmeloy/Qwen-7B-nerd-uncensored-v1.0 (Merge)
jeffmeloy_Qwen2.5-7B-minperplexity-2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
jeffmeloy/Qwen2.5-7B-minperplexity-2 (📑 open-llm-leaderboard/jeffmeloy__Qwen2.5-7B-minperplexity-2-details)
jeffmeloy/Qwen2.5-7B-minperplexity-2
d7ba08c49f9e13e65b0abbf8539037f0712233c2
25.031389
apache-2.0
0
7
true
false
false
true
0.978281
0.509731
50.973085
0.552391
36.89009
0.000755
0.075529
0.311242
8.165548
0.462458
16.907292
0.434591
37.176788
false
false
2024-12-03
2024-12-03
1
jeffmeloy/Qwen2.5-7B-minperplexity-2 (Merge)
jeffmeloy_Qwen2.5-7B-nerd-uncensored-v0.9_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
jeffmeloy/Qwen2.5-7B-nerd-uncensored-v0.9 (📑 open-llm-leaderboard/jeffmeloy__Qwen2.5-7B-nerd-uncensored-v0.9-details)
jeffmeloy/Qwen2.5-7B-nerd-uncensored-v0.9
7eb2a19e13fb32c1bab751eb89fed33f6c66b4e6
31.166804
apache-2.0
0
7
true
false
false
false
0.723031
0.604827
60.482741
0.54697
34.791599
0.250755
25.075529
0.322987
9.731544
0.48199
19.548698
0.436336
37.370715
false
false
2024-11-02
2024-11-13
1
jeffmeloy/Qwen2.5-7B-nerd-uncensored-v0.9 (Merge)
jeffmeloy_Qwen2.5-7B-nerd-uncensored-v1.0_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.0 (📑 open-llm-leaderboard/jeffmeloy__Qwen2.5-7B-nerd-uncensored-v1.0-details)
jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.0
8f478661c654990358904e2159252d5c5236b80f
28.36389
apache-2.0
3
7
true
false
false
true
0.731588
0.769516
76.9516
0.541763
34.737157
0.001511
0.151057
0.290268
5.369128
0.455115
16.822656
0.425366
36.151743
false
false
2024-10-30
2024-11-14
1
jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.0 (Merge)
jeffmeloy_Qwen2.5-7B-nerd-uncensored-v1.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.1 (📑 open-llm-leaderboard/jeffmeloy__Qwen2.5-7B-nerd-uncensored-v1.1-details)
jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.1
e757ba9e4c1a5a43ba3a3e98b44ebbbfe7bf831a
23.70241
apache-2.0
0
7
true
false
false
true
0.684101
0.66263
66.26296
0.486402
26.661152
0.074018
7.401813
0.286913
4.9217
0.384292
5.303125
0.384973
31.663712
false
false
2024-11-01
2024-11-09
1
jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.1 (Merge)
jeffmeloy_Qwen2.5-7B-nerd-uncensored-v1.2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.2 (📑 open-llm-leaderboard/jeffmeloy__Qwen2.5-7B-nerd-uncensored-v1.2-details)
jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.2
8ba84532e3eea17c821f96f3e80bec7c9d8b3799
23.146833
apache-2.0
0
7
true
false
false
false
0.625914
0.496467
49.646715
0.494593
27.660912
0.10574
10.574018
0.303691
7.158837
0.41725
10.85625
0.396858
32.984264
false
false
2024-11-01
2024-11-09
1
jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.2 (Merge)
jeffmeloy_Qwen2.5-7B-nerd-uncensored-v1.3_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.3 (📑 open-llm-leaderboard/jeffmeloy__Qwen2.5-7B-nerd-uncensored-v1.3-details)
jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.3
db61c49ae128777c4b893ab544975df349052d66
23.873165
apache-2.0
0
7
true
false
false
false
0.61535
0.499515
49.951462
0.502606
28.900263
0.111782
11.178248
0.312919
8.389262
0.41874
11.309115
0.401596
33.510638
false
false
2024-11-03
2024-11-09
1
jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.3 (Merge)
jeffmeloy_Qwen2.5-7B-nerd-uncensored-v1.4_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.4 (📑 open-llm-leaderboard/jeffmeloy__Qwen2.5-7B-nerd-uncensored-v1.4-details)
jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.4
7da97922062bae96d0e694fbd3a5f1c06cf375b6
30.823334
apache-2.0
1
7
true
false
false
false
0.661308
0.607875
60.787488
0.546708
34.856018
0.236405
23.640483
0.323826
9.8434
0.471385
17.823177
0.441905
37.989436
false
false
2024-11-16
2024-11-17
1
jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.4 (Merge)
jeffmeloy_Qwen2.5-7B-nerd-uncensored-v1.5_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.5 (📑 open-llm-leaderboard/jeffmeloy__Qwen2.5-7B-nerd-uncensored-v1.5-details)
jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.5
055cf43cab9027de7e548728dc231afea6a3dfd1
30.953777
apache-2.0
2
7
true
false
false
false
0.684101
0.565035
56.503522
0.55226
35.925321
0.22281
22.280967
0.327181
10.290828
0.498208
22.409375
0.444814
38.312648
false
false
2024-11-16
2024-11-17
1
jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.5 (Merge)
jeffmeloy_Qwen2.5-7B-nerd-uncensored-v1.7_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.7 (📑 open-llm-leaderboard/jeffmeloy__Qwen2.5-7B-nerd-uncensored-v1.7-details)
jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.7
a27917d12ac64d91a86afee9953bce6d1a9b6424
28.312594
apache-2.0
1
7
true
false
false
false
0.66487
0.420155
42.015519
0.539172
33.833511
0.269637
26.963746
0.323826
9.8434
0.484844
20.772135
0.428025
36.447252
false
false
2024-11-16
2024-11-17
1
jeffmeloy/Qwen2.5-7B-nerd-uncensored-v1.7 (Merge)
jeffmeloy_jeffmeloy_Qwen2.5-7B-minperplexity-1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
jeffmeloy/jeffmeloy_Qwen2.5-7B-minperplexity-1 (📑 open-llm-leaderboard/jeffmeloy__jeffmeloy_Qwen2.5-7B-minperplexity-1-details)
jeffmeloy/jeffmeloy_Qwen2.5-7B-minperplexity-1
8ae87cfb63b5afcb0a8aca3b7ba9b8044cc3ba0e
27.0856
apache-2.0
1
7
true
false
false
false
0.919796
0.375716
37.571643
0.558235
37.821507
0.268127
26.812689
0.332215
10.961969
0.429031
11.928906
0.436752
37.416888
false
false
2024-11-27
2024-11-27
1
jeffmeloy/jeffmeloy_Qwen2.5-7B-minperplexity-1 (Merge)
jeonsworld_CarbonVillain-en-10.7B-v4_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
jeonsworld/CarbonVillain-en-10.7B-v4 (📑 open-llm-leaderboard/jeonsworld__CarbonVillain-en-10.7B-v4-details)
jeonsworld/CarbonVillain-en-10.7B-v4
57d6ad4d705d336aba228356683d9f221507440a
19.548213
cc-by-nc-sa-4.0
6
10
true
false
false
true
0.769731
0.457924
45.792386
0.516796
31.80564
0
0
0.306208
7.494407
0.396542
8.401042
0.314162
23.795804
true
false
2023-12-30
2024-07-25
0
jeonsworld/CarbonVillain-en-10.7B-v4
jiangxinyang-shanda_Homer-LLama3-8B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
jiangxinyang-shanda/Homer-LLama3-8B (📑 open-llm-leaderboard/jiangxinyang-shanda__Homer-LLama3-8B-details)
jiangxinyang-shanda/Homer-LLama3-8B
550cdaea5feac5df9b0984bda14d00570daa4437
18.896193
0
8
false
false
false
true
0.696426
0.399172
39.917197
0.517324
31.698975
0.024924
2.492447
0.29698
6.263982
0.405625
9.236458
0.313913
23.7681
false
false
2024-11-08
0
Removed
jieliu_Storm-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
jieliu/Storm-7B (📑 open-llm-leaderboard/jieliu__Storm-7B-details)
jieliu/Storm-7B
71edab8ee6c2578e428b0359158fb0d43133e989
19.789054
apache-2.0
40
7
true
false
false
false
0.611913
0.342419
34.241923
0.518729
32.330284
0.062689
6.268882
0.307886
7.718121
0.442896
14.628646
0.311918
23.546469
false
false
2024-04-25
2024-06-26
2
mistralai/Mistral-7B-v0.1
johnsutor_Llama-3-8B-Instruct_breadcrumbs-density-0.1-gamma-0.01_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.1-gamma-0.01 (📑 open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_breadcrumbs-density-0.1-gamma-0.01-details)
johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.1-gamma-0.01
f4ebbf27d586e94c63f0a7293f565cbd947b824f
22.379186
apache-2.0
0
8
true
false
false
false
0.993518
0.427124
42.712447
0.503552
29.550014
0.041541
4.154079
0.322148
9.619687
0.46376
17.803385
0.37392
30.435505
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.1-gamma-0.01 (Merge)
johnsutor_Llama-3-8B-Instruct_breadcrumbs-density-0.1-gamma-0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.1-gamma-0.1 (📑 open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_breadcrumbs-density-0.1-gamma-0.1-details)
johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.1-gamma-0.1
66c7330e9d04b13a68ea7dcf25bc0a71d144221a
21.408592
apache-2.0
0
8
true
false
false
false
0.80822
0.425326
42.532591
0.501885
28.607718
0.094411
9.441088
0.301174
6.823266
0.415021
10.777604
0.372424
30.269282
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.1-gamma-0.1 (Merge)
johnsutor_Llama-3-8B-Instruct_breadcrumbs-density-0.3-gamma-0.01_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.3-gamma-0.01 (📑 open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_breadcrumbs-density-0.3-gamma-0.01-details)
johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.3-gamma-0.01
4a432be239528ffc654955338982f1f32eb12901
20.103542
apache-2.0
0
8
true
false
false
false
1.036563
0.337748
33.774829
0.491714
28.135682
0
0
0.312081
8.277405
0.501771
22.288021
0.353308
28.145316
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.3-gamma-0.01 (Merge)
johnsutor_Llama-3-8B-Instruct_breadcrumbs-density-0.3-gamma-0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.3-gamma-0.1 (📑 open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_breadcrumbs-density-0.3-gamma-0.1-details)
johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.3-gamma-0.1
d6f8ed8dc4b7f74b4312bc0d24aaac275c61958d
21.833221
apache-2.0
0
8
true
false
false
false
0.818299
0.427399
42.73993
0.512578
30.514943
0.080816
8.081571
0.308725
7.829978
0.422646
11.397396
0.37392
30.435505
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.3-gamma-0.1 (Merge)
johnsutor_Llama-3-8B-Instruct_breadcrumbs-density-0.5-gamma-0.01_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.5-gamma-0.01 (📑 open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_breadcrumbs-density-0.5-gamma-0.01-details)
johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.5-gamma-0.01
6ab1392c825907b08eff8fbed4c97a3e6e0d6dd9
19.384591
apache-2.0
0
8
true
false
false
false
1.027902
0.320362
32.036219
0.488358
27.665795
0
0
0.302013
6.935123
0.509771
23.621354
0.334441
26.049054
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.5-gamma-0.01 (Merge)
johnsutor_Llama-3-8B-Instruct_breadcrumbs-density-0.5-gamma-0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.5-gamma-0.1 (📑 open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_breadcrumbs-density-0.5-gamma-0.1-details)
johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.5-gamma-0.1
a481edaceeaab34f4dc0e90c4d8ec0f72658bbdd
22.358557
apache-2.0
0
8
true
false
false
false
0.815127
0.439639
43.963905
0.514004
30.854731
0.079305
7.930514
0.307047
7.606264
0.439792
13.840625
0.369598
29.955304
true
false
2024-06-08
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.5-gamma-0.1 (Merge)
johnsutor_Llama-3-8B-Instruct_breadcrumbs-density-0.7-gamma-0.01_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.7-gamma-0.01 (📑 open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_breadcrumbs-density-0.7-gamma-0.01-details)
johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.7-gamma-0.01
61f4b44fb917cdb46f0ade9f8fc2a382e0cf67af
18.442433
apache-2.0
0
8
true
false
false
false
1.024438
0.281444
28.144435
0.485433
27.164431
0
0
0.290268
5.369128
0.516313
24.472396
0.329538
25.504211
true
false
2024-06-08
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.7-gamma-0.01 (Merge)
johnsutor_Llama-3-8B-Instruct_breadcrumbs-density-0.7-gamma-0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.7-gamma-0.1 (📑 open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_breadcrumbs-density-0.7-gamma-0.1-details)
johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.7-gamma-0.1
139a9bccd0ffb284e670a181a5986a01b1420c6c
21.835686
apache-2.0
0
8
true
false
false
false
0.934131
0.430222
43.022181
0.51571
31.163508
0.066465
6.646526
0.307886
7.718121
0.433156
12.877865
0.366273
29.585919
true
false
2024-06-08
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.7-gamma-0.1 (Merge)
johnsutor_Llama-3-8B-Instruct_breadcrumbs-density-0.9-gamma-0.01_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.9-gamma-0.01 (📑 open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_breadcrumbs-density-0.9-gamma-0.01-details)
johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.9-gamma-0.01
c88c6b65f751156e7bc04c738947387eb55747e9
18.483613
apache-2.0
0
8
true
false
false
false
1.019766
0.278996
27.89964
0.486115
27.224869
0
0
0.294463
5.928412
0.51501
24.242969
0.330452
25.605792
true
false
2024-06-08
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.9-gamma-0.01 (Merge)
johnsutor_Llama-3-8B-Instruct_breadcrumbs-density-0.9-gamma-0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.9-gamma-0.1 (📑 open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_breadcrumbs-density-0.9-gamma-0.1-details)
johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.9-gamma-0.1
818f7e586444b551200862fb234c39bd48d69ae8
21.994606
apache-2.0
0
8
true
false
false
false
0.935226
0.422278
42.227844
0.515376
31.124766
0.077795
7.779456
0.307886
7.718121
0.438427
13.670052
0.365027
29.4474
true
false
2024-06-08
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.9-gamma-0.1 (Merge)
johnsutor_Llama-3-8B-Instruct_breadcrumbs_ties-density-0.1-gamma-0.01_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.1-gamma-0.01 (📑 open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_breadcrumbs_ties-density-0.1-gamma-0.01-details)
johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.1-gamma-0.01
861347cd643d396877d8e560367cf0717c671228
22.199406
apache-2.0
0
8
true
false
false
false
1.034864
0.435892
43.589232
0.504094
29.530013
0.049849
4.984894
0.310403
8.053691
0.453156
16.344531
0.376247
30.694075
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.1-gamma-0.01 (Merge)
johnsutor_Llama-3-8B-Instruct_breadcrumbs_ties-density-0.1-gamma-0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.1-gamma-0.1 (📑 open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_breadcrumbs_ties-density-0.1-gamma-0.1-details)
johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.1-gamma-0.1
2647bc863e6ee686e7174366107eecbd4b37f62e
21.278226
apache-2.0
0
8
true
false
false
false
0.841316
0.420155
42.015519
0.501124
28.504906
0.096677
9.667674
0.300336
6.711409
0.415021
10.777604
0.36993
29.992243
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.1-gamma-0.1 (Merge)
johnsutor_Llama-3-8B-Instruct_breadcrumbs_ties-density-0.3-gamma-0.01_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.3-gamma-0.01 (📑 open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_breadcrumbs_ties-density-0.3-gamma-0.01-details)
johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.3-gamma-0.01
fa77530fe3723d7b15b06b88c3ca6110a8421742
20.410339
apache-2.0
0
8
true
false
false
false
1.055793
0.351787
35.178659
0.499852
29.136919
0.01284
1.283988
0.306208
7.494407
0.487104
20.354688
0.36112
29.013372
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.3-gamma-0.01 (Merge)
johnsutor_Llama-3-8B-Instruct_breadcrumbs_ties-density-0.3-gamma-0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.3-gamma-0.1 (📑 open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_breadcrumbs_ties-density-0.3-gamma-0.1-details)
johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.3-gamma-0.1
6fe73aa7f9c5b59297739166e9557089d39e5fc7
21.773954
apache-2.0
0
8
true
false
false
false
0.851836
0.42038
42.038015
0.51073
30.244176
0.090634
9.063444
0.30453
7.270694
0.427854
11.915104
0.371011
30.112293
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.3-gamma-0.1 (Merge)
johnsutor_Llama-3-8B-Instruct_breadcrumbs_ties-density-0.5-gamma-0.01_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.5-gamma-0.01 (📑 open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_breadcrumbs_ties-density-0.5-gamma-0.01-details)
johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.5-gamma-0.01
a31f86b538ba8b2983620cc27a741bc9a81a7e2f
20.086695
apache-2.0
0
8
true
false
false
false
0.952685
0.345417
34.541683
0.498383
29.320608
0.012085
1.208459
0.29698
6.263982
0.491135
21.058594
0.353142
28.126847
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.5-gamma-0.01 (Merge)
johnsutor_Llama-3-8B-Instruct_breadcrumbs_ties-density-0.5-gamma-0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.5-gamma-0.1 (📑 open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_breadcrumbs_ties-density-0.5-gamma-0.1-details)
johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.5-gamma-0.1
f9d5bab1c1d0d6890e89b513225d13f68a1c6d75
21.442231
apache-2.0
0
8
true
false
false
false
0.793357
0.409164
40.916435
0.513666
30.693077
0.080816
8.081571
0.295302
6.040268
0.435698
13.26224
0.366938
29.659796
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.5-gamma-0.1 (Merge)
johnsutor_Llama-3-8B-Instruct_breadcrumbs_ties-density-0.7-gamma-0.01_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.7-gamma-0.01" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.7-gamma-0.01</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_breadcrumbs_ties-density-0.7-gamma-0.01-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.7-gamma-0.01
d30c75506feaec957dc73bc5c040159c310ecf4c
19.149612
apache-2.0
0
8
true
false
false
false
0.908481
0.290387
29.038728
0.496734
28.739266
0.006042
0.60423
0.299497
6.599553
0.499073
22.250781
0.348986
27.665115
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.7-gamma-0.01 (Merge)
johnsutor_Llama-3-8B-Instruct_breadcrumbs_ties-density-0.7-gamma-0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.7-gamma-0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.7-gamma-0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_breadcrumbs_ties-density-0.7-gamma-0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.7-gamma-0.1
cd52bafe64e82d466d0bc590da5399f2299d24e1
21.626888
apache-2.0
0
8
true
false
false
false
0.806262
0.41988
41.988036
0.514691
31.007758
0.080816
8.081571
0.298658
6.487696
0.43576
13.136719
0.361536
29.059545
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.7-gamma-0.1 (Merge)
johnsutor_Llama-3-8B-Instruct_breadcrumbs_ties-density-0.9-gamma-0.01_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.9-gamma-0.01" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.9-gamma-0.01</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_breadcrumbs_ties-density-0.9-gamma-0.01-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.9-gamma-0.01
4c30fdbe0708afefe50788ea640c3dfab294c77f
18.884377
apache-2.0
0
8
true
false
false
false
0.900631
0.291311
29.13115
0.49183
28.219373
0
0
0.300336
6.711409
0.497677
21.976302
0.345412
27.268026
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.9-gamma-0.01 (Merge)
johnsutor_Llama-3-8B-Instruct_breadcrumbs_ties-density-0.9-gamma-0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.9-gamma-0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.9-gamma-0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_breadcrumbs_ties-density-0.9-gamma-0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.9-gamma-0.1
378a7cad3e34a1a8b11e77edd95b02ff0d228da2
21.374085
apache-2.0
0
8
true
false
false
false
0.929912
0.416233
41.623337
0.513861
30.841602
0.07855
7.854985
0.29698
6.263982
0.431729
12.499479
0.36245
29.161126
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.9-gamma-0.1 (Merge)
johnsutor_Llama-3-8B-Instruct_dare_linear_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_dare_linear" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_dare_linear</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_dare_linear-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
johnsutor/Llama-3-8B-Instruct_dare_linear
abb81fd8fdc2ad32f65befcb7ae369c9837cd563
14.123523
apache-2.0
0
8
true
false
false
false
0.910256
0.21455
21.454962
0.428281
19.610999
0
0
0.296141
6.152125
0.497927
21.807552
0.241439
15.715499
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_dare_linear (Merge)
johnsutor_Llama-3-8B-Instruct_dare_ties-density-0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_dare_ties-density-0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.1
e7a3a3b955d945f53da8301b958f0b90a28a62d3
11.619907
apache-2.0
0
8
true
false
false
false
0.911134
0.189071
18.907056
0.411874
16.858917
0
0
0.271812
2.908277
0.465802
16.991927
0.226479
14.053265
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.1 (Merge)
johnsutor_Llama-3-8B-Instruct_dare_ties-density-0.3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_dare_ties-density-0.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.3
6f966d14d7236f3da6d1ea9ce3bd9b20808e02a9
15.943771
apache-2.0
0
8
true
false
false
false
0.923385
0.211327
21.132706
0.455857
23.094936
0
0
0.29698
6.263982
0.506948
22.501823
0.304023
22.669178
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.3 (Merge)
johnsutor_Llama-3-8B-Instruct_dare_ties-density-0.7_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.7" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.7</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_dare_ties-density-0.7-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.7
b14b5cd07feb749e42b0567b1e387b390bed033e
16.721678
apache-2.0
0
8
true
false
false
false
1.037712
0.203384
20.338369
0.472286
25.253546
0
0
0.303691
7.158837
0.51101
23.709635
0.314827
23.869681
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.7 (Merge)
johnsutor_Llama-3-8B-Instruct_dare_ties-density-0.9_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.9" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.9</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_dare_ties-density-0.9-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.9
17.284593
apache-2.0
0
8
true
false
false
false
1.32966
0.216073
21.607335
0.466396
24.687623
0
0
0.307886
7.718121
0.523042
25.880208
0.314328
23.814273
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.9 (Merge)
johnsutor_Llama-3-8B-Instruct_linear_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_linear" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_linear</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_linear-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
johnsutor/Llama-3-8B-Instruct_linear
7449157fbc2e8b02e5b6e8ad56b4b2bd7ea82e9d
21.358284
apache-2.0
0
8
true
false
false
false
0.826106
0.430821
43.082133
0.50315
28.778577
0.099698
9.969789
0.295302
6.040268
0.409719
10.148177
0.371177
30.130762
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_linear (Merge)
johnsutor_Llama-3-8B-Instruct_ties-density-0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_ties-density-0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_ties-density-0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_ties-density-0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
johnsutor/Llama-3-8B-Instruct_ties-density-0.1
84793f89ebe3be5b5bd9a797d4bbdf374c07419d
20.428512
apache-2.0
0
8
true
false
false
false
0.786361
0.411612
41.16123
0.502145
28.768719
0.079305
7.930514
0.288591
5.145414
0.417375
10.671875
0.36004
28.893322
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_ties-density-0.1 (Merge)
johnsutor_Llama-3-8B-Instruct_ties-density-0.3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_ties-density-0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_ties-density-0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_ties-density-0.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
johnsutor/Llama-3-8B-Instruct_ties-density-0.3
8d051f3eec3fc93a4521073c2d290c4ff9144fc1
18.842382
apache-2.0
0
8
true
false
false
false
0.957335
0.362628
36.262783
0.490611
27.724507
0.066465
6.646526
0.296141
6.152125
0.40249
10.477865
0.332114
25.790485
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_ties-density-0.3 (Merge)
johnsutor_Llama-3-8B-Instruct_ties-density-0.5_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_ties-density-0.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_ties-density-0.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_ties-density-0.5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
johnsutor/Llama-3-8B-Instruct_ties-density-0.5
c857e33c30016960f114e3a049f5dae41d68bfe7
18.209008
apache-2.0
0
8
true
false
false
false
0.841933
0.379664
37.966374
0.479312
26.012097
0.060423
6.042296
0.30453
7.270694
0.387979
7.797396
0.317487
24.165189
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_ties-density-0.5 (Merge)
johnsutor_Llama-3-8B-Instruct_ties-density-0.7_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_ties-density-0.7" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_ties-density-0.7</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_ties-density-0.7-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
johnsutor/Llama-3-8B-Instruct_ties-density-0.7
8d7d8bbb1e8cba5e51337f97bc3d6d8ae40544d5
18.00619
apache-2.0
0
8
true
false
false
false
0.898051
0.368123
36.812325
0.473819
25.371408
0.064199
6.41994
0.309564
7.941834
0.388073
7.575781
0.315243
23.915854
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_ties-density-0.7 (Merge)
johnsutor_Llama-3-8B-Instruct_ties-density-0.9_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_ties-density-0.9" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_ties-density-0.9</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_ties-density-0.9-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
johnsutor/Llama-3-8B-Instruct_ties-density-0.9
57c280ce43fe81a23c966b48de6db7f4a85383a3
18.135851
apache-2.0
0
8
true
false
false
false
0.901058
0.385809
38.580854
0.473543
25.463735
0.061934
6.193353
0.299497
6.599553
0.388042
7.738542
0.318152
24.239066
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_ties-density-0.9 (Merge)
jpacifico_Chocolatine-14B-Instruct-4k-DPO_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/jpacifico/Chocolatine-14B-Instruct-4k-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">jpacifico/Chocolatine-14B-Instruct-4k-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/jpacifico__Chocolatine-14B-Instruct-4k-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
jpacifico/Chocolatine-14B-Instruct-4k-DPO
30677e58010979af26b70240846fdf7ff38cbbf2
30.026894
mit
1
13
true
false
false
false
8.248901
0.468865
46.886483
0.629958
48.020722
0.160876
16.087613
0.341443
12.192394
0.443885
15.152344
0.476396
41.821809
false
false
2024-08-01
2024-08-08
0
jpacifico/Chocolatine-14B-Instruct-4k-DPO
jpacifico_Chocolatine-14B-Instruct-DPO-v1.2_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/jpacifico/Chocolatine-14B-Instruct-DPO-v1.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">jpacifico/Chocolatine-14B-Instruct-DPO-v1.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/jpacifico__Chocolatine-14B-Instruct-DPO-v1.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
jpacifico/Chocolatine-14B-Instruct-DPO-v1.2
d34bbd55b48e553f28579d86f3ccae19726c6b39
33.544049
mit
14
13
true
false
false
true
1.540603
0.685211
68.52108
0.643841
49.845064
0.194109
19.410876
0.325503
10.067114
0.426771
12.346354
0.469664
41.073803
false
false
2024-08-12
2024-08-28
0
jpacifico/Chocolatine-14B-Instruct-DPO-v1.2
jpacifico_Chocolatine-3B-Instruct-DPO-Revised_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/jpacifico/Chocolatine-3B-Instruct-DPO-Revised" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">jpacifico/Chocolatine-3B-Instruct-DPO-Revised</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/jpacifico__Chocolatine-3B-Instruct-DPO-Revised-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
jpacifico/Chocolatine-3B-Instruct-DPO-Revised
c403df6c0f78148cfb477972455cbd859149311a
27.911928
mit
26
3
true
false
false
true
0.754724
0.562263
56.226257
0.553998
37.155286
0.161631
16.163142
0.322148
9.619687
0.445344
15.101302
0.398853
33.205895
false
false
2024-07-17
2024-07-19
0
jpacifico/Chocolatine-3B-Instruct-DPO-Revised
jpacifico_Chocolatine-3B-Instruct-DPO-v1.0_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/jpacifico/Chocolatine-3B-Instruct-DPO-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">jpacifico/Chocolatine-3B-Instruct-DPO-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/jpacifico__Chocolatine-3B-Instruct-DPO-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
jpacifico/Chocolatine-3B-Instruct-DPO-v1.0
98d049b8f8c305cfba81adae498a95e6b5647d4a
25.203004
apache-2.0
3
3
true
false
false
false
0.799223
0.373718
37.37184
0.54714
36.55452
0.164653
16.465257
0.315436
8.724832
0.475479
19.468229
0.3937
32.633348
false
false
2024-07-11
2024-07-11
0
jpacifico/Chocolatine-3B-Instruct-DPO-v1.0
jpacifico_Chocolatine-3B-Instruct-DPO-v1.2_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/jpacifico/Chocolatine-3B-Instruct-DPO-v1.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">jpacifico/Chocolatine-3B-Instruct-DPO-v1.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/jpacifico__Chocolatine-3B-Instruct-DPO-v1.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
jpacifico/Chocolatine-3B-Instruct-DPO-v1.2
ebc9de6c266586adb1ec0db31bf050d1cd8fdffe
26.804511
mit
8
3
true
false
false
true
0.974445
0.545501
54.550149
0.548718
35.999388
0.141239
14.123867
0.338926
11.856823
0.415427
12.328385
0.387716
31.968454
false
false
2024-08-22
2024-08-28
0
jpacifico/Chocolatine-3B-Instruct-DPO-v1.2
jsfs11_MixtureofMerges-MoE-4x7b-v4_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/jsfs11/MixtureofMerges-MoE-4x7b-v4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">jsfs11/MixtureofMerges-MoE-4x7b-v4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/jsfs11__MixtureofMerges-MoE-4x7b-v4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
jsfs11/MixtureofMerges-MoE-4x7b-v4
2b98406f20a874184dbffb5ed24e1f4b5063ec4b
20.047537
apache-2.0
4
24
true
true
false
false
1.383828
0.402994
40.299406
0.516901
32.217998
0.064955
6.495468
0.286074
4.809843
0.438552
13.885677
0.303191
22.576832
true
false
2024-02-11
2024-08-05
1
jsfs11/MixtureofMerges-MoE-4x7b-v4 (Merge)
jsfs11_MixtureofMerges-MoE-4x7b-v5_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/jsfs11/MixtureofMerges-MoE-4x7b-v5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">jsfs11/MixtureofMerges-MoE-4x7b-v5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/jsfs11__MixtureofMerges-MoE-4x7b-v5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
jsfs11/MixtureofMerges-MoE-4x7b-v5
c1b5ce7144b966062df7627d2482a59e0df3757c
20.447529
apache-2.0
1
24
true
true
false
false
1.431272
0.41993
41.993023
0.519848
32.826724
0.076284
7.628399
0.284396
4.58613
0.43049
12.344531
0.309757
23.306368
true
false
2024-02-25
2024-08-05
1
jsfs11/MixtureofMerges-MoE-4x7b-v5 (Merge)
kaist-ai_janus-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/kaist-ai/janus-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kaist-ai/janus-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/kaist-ai__janus-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
kaist-ai/janus-7b
f19c614ae7c81db06af1655d297c67afa99ad286
17.654763
apache-2.0
8
7
true
false
false
false
0.606603
0.377515
37.751499
0.469367
25.74987
0.043051
4.305136
0.272651
3.020134
0.440104
14.279688
0.2874
20.822252
false
false
2024-04-04
2024-10-09
1
alpindale/Mistral-7B-v0.2-hf
kaist-ai_janus-dpo-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/kaist-ai/janus-dpo-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kaist-ai/janus-dpo-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/kaist-ai__janus-dpo-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
kaist-ai/janus-dpo-7b
a414396b6d03fba75d12ccf7d8391186b4b639ce
18.531649
apache-2.0
2
7
true
false
false
false
0.626428
0.400271
40.027128
0.477258
27.090902
0.041541
4.154079
0.281879
4.250559
0.43874
13.709115
0.297623
21.958112
false
false
2024-04-25
2024-10-09
1
Removed
kaist-ai_janus-rm-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LLMForSequenceRegression
<a target="_blank" href="https://huggingface.co/kaist-ai/janus-rm-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kaist-ai/janus-rm-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/kaist-ai__janus-rm-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
kaist-ai/janus-rm-7b
ffdbcc353ad4034fdfa68a767d265920d5f3e71c
4.775599
apache-2.0
4
7
true
false
false
false
0.539111
0.177805
17.780489
0.305647
3.277781
0
0
0.251678
0.223714
0.388292
5.969792
0.112616
1.401817
false
false
2024-05-09
2024-10-09
0
kaist-ai/janus-rm-7b
kaist-ai_mistral-orpo-capybara-7k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/kaist-ai/mistral-orpo-capybara-7k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kaist-ai/mistral-orpo-capybara-7k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/kaist-ai__mistral-orpo-capybara-7k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
kaist-ai/mistral-orpo-capybara-7k
24c1172060658a1923c9b454796857e2cc59fbeb
19.18313
mit
26
7
true
false
false
true
0.660744
0.536734
53.673364
0.4489
23.434359
0.037009
3.700906
0.286074
4.809843
0.396354
7.577604
0.297124
21.902704
false
false
2024-03-23
2024-10-09
1
kaist-ai/mistral-orpo-capybara-7k (Merge)
keeeeenw_MicroLlama_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/keeeeenw/MicroLlama" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">keeeeenw/MicroLlama</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/keeeeenw__MicroLlama-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
keeeeenw/MicroLlama
8d5874ca07b86ea1ea2e71eea96212278506ba65
5.077267
apache-2.0
39
0
true
false
false
false
0.185768
0.198538
19.853766
0.300731
2.831364
0
0
0.260906
1.454139
0.369812
4.793229
0.11378
1.531102
false
false
2024-03-29
2024-09-15
0
keeeeenw/MicroLlama
kekmodel_StopCarbon-10.7B-v5_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/kekmodel/StopCarbon-10.7B-v5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kekmodel/StopCarbon-10.7B-v5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/kekmodel__StopCarbon-10.7B-v5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
kekmodel/StopCarbon-10.7B-v5
7d59819dce2439f6c83b4f5c21a68aa882ff5ac9
20.001472
cc-by-nc-sa-4.0
2
10
true
false
false
true
0.745294
0.472837
47.283652
0.517772
31.993222
0
0
0.306208
7.494407
0.401938
9.275521
0.315658
23.962027
true
false
2023-12-30
2024-07-25
0
kekmodel/StopCarbon-10.7B-v5
kevin009_llamaRAGdrama_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/kevin009/llamaRAGdrama" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kevin009/llamaRAGdrama</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/kevin009__llamaRAGdrama-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
kevin009/llamaRAGdrama
8c103ca8fa6dd9a8d3dab81b319408095e9a1ad8
13.222836
apache-2.0
7
7
true
false
false
true
0.639639
0.259837
25.983723
0.400739
16.637814
0.035498
3.549849
0.264262
1.901566
0.431573
12.113281
0.272357
19.150783
false
false
2024-02-04
2024-06-26
0
kevin009/llamaRAGdrama
kms7530_chemeng_llama-3-8b-Instruct-bnb-4bit_24_1_100_1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/kms7530/chemeng_llama-3-8b-Instruct-bnb-4bit_24_1_100_1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kms7530/chemeng_llama-3-8b-Instruct-bnb-4bit_24_1_100_1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/kms7530__chemeng_llama-3-8b-Instruct-bnb-4bit_24_1_100_1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
kms7530/chemeng_llama-3-8b-Instruct-bnb-4bit_24_1_100_1
f296897830363557c84cc4a942c2cd1f91818ae4
17.556133
apache-2.0
0
9
true
false
false
true
2.299978
0.545501
54.550149
0.428904
19.07919
0.035498
3.549849
0.270134
2.684564
0.382062
5.491146
0.279837
19.9819
false
false
2024-10-10
2024-10-14
2
Removed
kms7530_chemeng_phi-3-mini-4k-instruct-bnb-4bit_16_4_100_1_nonmath_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/kms7530/chemeng_phi-3-mini-4k-instruct-bnb-4bit_16_4_100_1_nonmath" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kms7530/chemeng_phi-3-mini-4k-instruct-bnb-4bit_16_4_100_1_nonmath</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/kms7530__chemeng_phi-3-mini-4k-instruct-bnb-4bit_16_4_100_1_nonmath-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
kms7530/chemeng_phi-3-mini-4k-instruct-bnb-4bit_16_4_100_1_nonmath
81453e5718775630581ab9950e6c0ccf0d7a4177
21.984419
apache-2.0
1
4
true
false
false
true
1.356985
0.486325
48.632517
0.498718
29.259631
0.100453
10.045317
0.310403
8.053691
0.398281
8.351823
0.348072
27.563534
false
false
2024-11-23
2024-11-25
1
unsloth/Phi-3-mini-4k-instruct-bnb-4bit
kms7530_chemeng_qwen-math-7b_24_1_100_1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/kms7530/chemeng_qwen-math-7b_24_1_100_1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kms7530/chemeng_qwen-math-7b_24_1_100_1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/kms7530__chemeng_qwen-math-7b_24_1_100_1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
kms7530/chemeng_qwen-math-7b_24_1_100_1
b3c1a1875fe4679e8c402b2bde02ae6c1127eb63
12.972013
apache-2.0
0
8
true
false
false
true
4.62359
0.179753
17.975306
0.404001
16.463149
0.148036
14.803625
0.284396
4.58613
0.388573
6.238281
0.25989
17.765588
false
false
2024-10-10
2024-10-14
4
Qwen/Qwen2.5-7B
kms7530_chemeng_qwen-math-7b_24_1_100_1_nonmath_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
<a target="_blank" href="https://huggingface.co/kms7530/chemeng_qwen-math-7b_24_1_100_1_nonmath" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kms7530/chemeng_qwen-math-7b_24_1_100_1_nonmath</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/kms7530__chemeng_qwen-math-7b_24_1_100_1_nonmath-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
kms7530/chemeng_qwen-math-7b_24_1_100_1_nonmath
ef9926d75ab1d54532f6a30dd5e760355eb9aa4d
16.591858
apache-2.0
0
15
true
false
false
true
1.279031
0.258363
25.836336
0.389286
14.135345
0.286254
28.625378
0.290268
5.369128
0.408698
9.453906
0.24518
16.131058
false
false
2024-11-21
2024-11-22
4
Qwen/Qwen2.5-7B
kno10_ende-chat-0.0.5_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/kno10/ende-chat-0.0.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kno10/ende-chat-0.0.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/kno10__ende-chat-0.0.5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
kno10/ende-chat-0.0.5
fff913e8ce204bab72b02582b663db669cb61412
10.636087
apache-2.0
0
7
true
false
false
true
1.974966
0.340446
34.044557
0.360437
11.125831
0.007553
0.755287
0.265101
2.013423
0.393844
7.097135
0.179023
8.78029
false
false
2024-06-27
2024-06-27
0
kno10/ende-chat-0.0.5
kno10_ende-chat-0.0.7_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/kno10/ende-chat-0.0.7" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kno10/ende-chat-0.0.7</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/kno10__ende-chat-0.0.7-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
kno10/ende-chat-0.0.7
1d45f51e5a3387378cea1036b0c65f2893466dd6
13.082387
apache-2.0
0
7
true
false
false
true
0.958366
0.440063
44.006348
0.379187
13.578949
0
0
0.28104
4.138702
0.386125
6.032292
0.196642
10.738032
false
false
2024-07-30
2024-07-30
0
kno10/ende-chat-0.0.7
ladydaina_ECE-FDF_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/ladydaina/ECE-FDF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ladydaina/ECE-FDF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ladydaina__ECE-FDF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ladydaina/ECE-FDF
81e709d727e9ba5cf8707fe0c5c08e688a4cc6bd
20.004601
0
7
false
false
false
false
0.446224
0.372844
37.284405
0.515018
32.250998
0.079305
7.930514
0.282718
4.362416
0.450396
15.899479
0.300698
22.299793
false
false
2024-11-14
2024-11-14
1
ladydaina/ECE-FDF (Merge)
laislemke_LLaMA-2-vicuna-7b-slerp_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/laislemke/LLaMA-2-vicuna-7b-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">laislemke/LLaMA-2-vicuna-7b-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/laislemke__LLaMA-2-vicuna-7b-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
laislemke/LLaMA-2-vicuna-7b-slerp
84a64f0ac8ff7db632a9d012fd5f4dcdf1eff950
7.669226
llama2
0
6
true
false
false
true
0.597076
0.29321
29.320979
0.298622
2.598264
0.009819
0.981873
0.27349
3.131991
0.383302
6.179427
0.134225
3.802822
true
false
2024-07-03
2024-07-03
1
laislemke/LLaMA-2-vicuna-7b-slerp (Merge)
lalainy_ECE-PRYMMAL-0.5B-FT-V5-MUSR_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/lalainy/ECE-PRYMMAL-0.5B-FT-V5-MUSR" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lalainy/ECE-PRYMMAL-0.5B-FT-V5-MUSR</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lalainy__ECE-PRYMMAL-0.5B-FT-V5-MUSR-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lalainy/ECE-PRYMMAL-0.5B-FT-V5-MUSR
bf80bf3d14a79b5dcb322b97b6dbaf10e316a3ee
6.529138
apache-2.0
0
0
true
false
false
false
0.58294
0.213775
21.377501
0.326944
6.485922
0.013595
1.359517
0.274329
3.243848
0.32625
0.78125
0.153341
5.926788
false
false
2024-10-22
2024-10-22
0
lalainy/ECE-PRYMMAL-0.5B-FT-V5-MUSR
lalainy_ECE-PRYMMAL-0.5B-SLERP-V4_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/lalainy/ECE-PRYMMAL-0.5B-SLERP-V4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lalainy/ECE-PRYMMAL-0.5B-SLERP-V4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lalainy__ECE-PRYMMAL-0.5B-SLERP-V4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lalainy/ECE-PRYMMAL-0.5B-SLERP-V4
3a34c33dba0f02cd8c5172f45b6f6510cad1563d
4.380943
apache-2.0
0
0
true
false
false
false
1.267973
0.156397
15.639725
0.289431
2.09608
0
0
0.262584
1.677852
0.378927
4.999219
0.116855
1.872784
false
false
2024-10-22
2024-10-22
0
lalainy/ECE-PRYMMAL-0.5B-SLERP-V4
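The rows above are a flattened export: each model occupies a fixed run of fields, one per line, in the order identifier, precision, model type, weight type, architecture, the linked model card and details dataset, full name, commit SHA, the averaged score, license, likes, parameter count, a few boolean flags, CO₂ cost, the per-benchmark raw and normalized scores, merge/provider flags, dates, generation, and base model. Below is a minimal sketch of how one might regroup such a dump back into records. The field order and the label names are assumptions inferred from the repeating pattern of the rows, not an official schema, and rows where an optional field is empty (for example the missing commit SHA or license seen in a couple of rows above) emit one line fewer and would need manual realignment.

```python
# Minimal sketch for regrouping the flattened leaderboard dump into records.
# The 36-field order and the label names below are assumptions inferred from the
# repeating pattern of the rows above; they are descriptive labels chosen for
# this sketch, not an official schema.
from typing import Iterator

FIELDS = [
    "eval_name", "precision", "type", "type_emoji", "weight_type", "architecture",
    "model_html", "fullname", "model_sha", "average", "hub_license", "hub_likes",
    "params_b", "on_hub", "moe", "flagged", "chat_template", "co2_kg",
    "ifeval_raw", "ifeval", "bbh_raw", "bbh", "math_lvl5_raw", "math_lvl5",
    "gpqa_raw", "gpqa", "musr_raw", "musr", "mmlu_pro_raw", "mmlu_pro",
    "merged", "official_providers", "upload_date", "submission_date",
    "generation", "base_model",
]

def parse_rows(lines: list[str]) -> Iterator[dict[str, str]]:
    """Group consecutive non-empty lines into fixed-width records of len(FIELDS)."""
    buf: list[str] = []
    for line in lines:
        line = line.rstrip("\n")
        if not line:
            continue
        buf.append(line)
        if len(buf) == len(FIELDS):
            yield dict(zip(FIELDS, buf))
            buf = []
    # Leftover lines indicate a row whose optional fields were empty in the dump
    # and therefore produced fewer lines; such rows need manual alignment.
    if buf:
        yield dict(zip(FIELDS, buf))

if __name__ == "__main__":
    # Hypothetical usage: read the dump from a text file and print name + average.
    with open("leaderboard_dump.txt", encoding="utf-8") as fh:
        for row in parse_rows(fh.readlines()):
            print(row.get("eval_name"), row.get("average"))
```

As a usage note, a complete row such as the last one above would parse into a dict whose `eval_name` is `lalainy_ECE-PRYMMAL-0.5B-SLERP-V4_bfloat16` and whose `average` is `4.380943`; the two short rows in this section would surface as leftover records with fewer than 36 fields.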