Open LLM Leaderboard evaluation results — dataset schema (36 columns):

| Column | Dtype | Range / distinct values |
| --- | --- | --- |
| eval_name | string | 12–111 chars |
| Precision | string | 3 values |
| Type | string | 7 values |
| T | string | 7 values |
| Weight type | string | 2 values |
| Architecture | string | 64 values |
| Model | string | 355–689 chars |
| fullname | string | 4–102 chars |
| Model sha | string | 0–40 chars |
| Average ⬆️ | float64 | 0.74–52.1 |
| Hub License | string | 27 values |
| Hub ❤️ | int64 | 0–6.09k |
| #Params (B) | float64 | -1–141 |
| Available on the hub | bool | 2 values |
| MoE | bool | 2 values |
| Flagged | bool | 2 values |
| Chat Template | bool | 2 values |
| CO₂ cost (kg) | float64 | 0.04–187 |
| IFEval Raw | float64 | 0–0.9 |
| IFEval | float64 | 0–90 |
| BBH Raw | float64 | 0.22–0.83 |
| BBH | float64 | 0.25–76.7 |
| MATH Lvl 5 Raw | float64 | 0–0.71 |
| MATH Lvl 5 | float64 | 0–71.5 |
| GPQA Raw | float64 | 0.21–0.47 |
| GPQA | float64 | 0–29.4 |
| MUSR Raw | float64 | 0.29–0.6 |
| MUSR | float64 | 0–38.7 |
| MMLU-PRO Raw | float64 | 0.1–0.73 |
| MMLU-PRO | float64 | 0–70 |
| Merged | bool | 2 values |
| Official Providers | bool | 2 values |
| Upload To Hub Date | string | 525 values |
| Submission Date | string | 263 values |
| Generation | int64 | 0–10 |
| Base Model | string | 4–102 chars |

Conventions used in the row tables below:

- `eval_name` is `fullname` with `/` replaced by `_`, plus a `_<Precision>` suffix (e.g. `lesubra_ECE-PRYMMAL-3B-SLERP_2-V2_float16`), so it is not repeated below.
- `Model` is a rendered HTML cell: a link to `https://huggingface.co/<fullname>` plus a 📑 link to the per-model results dataset `open-llm-leaderboard/<org>__<name>-details`; only `fullname` is shown below.
- `T` is the emoji shorthand of `Type`: 🤝 base merges and moerges, 🔶 fine-tuned on domain-specific datasets, 💬 chat models (RLHF, DPO, IFT, ...), 🟩 continuously pretrained.
- Each benchmark has a raw accuracy in [0, 1] (`… Raw`) and a normalized score in [0, 100].
- `—` marks a field that is empty in the source; `…` marks fields cut off at the end of this extract.
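This schema matches the leaderboard's machine-readable contents, so the table can be queried programmatically. A minimal sketch, assuming the extract corresponds to the `open-llm-leaderboard/contents` dataset on the Hugging Face Hub (the dataset id is not stated in this extract):

```python
# Load the leaderboard table and filter it with pandas.
# Assumption: this extract matches the `open-llm-leaderboard/contents`
# dataset id; swap in the actual id if it differs.
from datasets import load_dataset

ds = load_dataset("open-llm-leaderboard/contents", split="train")
df = ds.to_pandas()

# Example: merged Qwen2 checkpoints, best average score first.
qwen_merges = df[(df["Architecture"] == "Qwen2ForCausalLM") & df["Merged"]]
print(
    qwen_merges.sort_values("Average ⬆️", ascending=False)[
        ["fullname", "Precision", "Average ⬆️", "#Params (B)"]
    ].head(10)
)
```

Note that the column names must be used verbatim, emoji included ("Average ⬆️", "Hub ❤️").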
Model metadata, one row per evaluated checkpoint. Flags abbreviates the six boolean columns: hub = Available on the hub, moe = MoE, flag = Flagged, chat = Chat Template, merge = Merged, off = Official Providers (no row in this extract sets moe or flag):

| fullname | Precision | T | Weight type | Architecture | Model sha | Hub License | Hub ❤️ | #Params (B) | Flags | Uploaded | Submitted | Gen | Base Model |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| lesubra/ECE-PRYMMAL-3B-SLERP_2-V2 | float16 | 🤝 | Original | Phi3ForCausalLM | d5074a951206f946a6be331a74bd4fa381d348eb | apache-2.0 | 0 | 3.821 | hub, merge | 2024-11-06 | 2024-11-06 | 0 | lesubra/ECE-PRYMMAL-3B-SLERP_2-V2 |
| lesubra/merge-test | float16 | 🔶 | Original | Phi3ForCausalLM | 39895c64dd646443719873a2ab2b19d3afe4f86c | apache-2.0 | 0 | 3.821 | hub, chat, merge | 2024-09-27 | 2024-09-27 | 0 | lesubra/merge-test |
| lightblue/suzume-llama-3-8B-multilingual | bfloat16 | 🔶 | Original | LlamaForCausalLM | 0cb15aa9ec685eef494f9a15f65aefcfe3c04c66 | other | 110 | 8.03 | hub, chat | 2024-04-23 | 2024-07-30 | 1 | meta-llama/Meta-Llama-3-8B-Instruct |
| lightblue/suzume-llama-3-8B-multilingual-orpo-borda-full | bfloat16 | 💬 | Original | LlamaForCausalLM | ac04e23fb8861c188f8ecddfecc4250b40aee04d | cc-by-nc-4.0 | 2 | 8.03 | hub, chat | 2024-04-25 | 2024-07-29 | 2 | meta-llama/Meta-Llama-3-8B-Instruct |
| lightblue/suzume-llama-3-8B-multilingual-orpo-borda-half | bfloat16 | 💬 | Original | LlamaForCausalLM | b82150a9840ba5ba93918c745adc70afc6ad2ce1 | cc-by-nc-4.0 | 16 | 8.03 | hub, chat | 2024-04-25 | 2024-06-29 | 2 | meta-llama/Meta-Llama-3-8B-Instruct |
| lightblue/suzume-llama-3-8B-multilingual-orpo-borda-top25 | bfloat16 | 💬 | Original | LlamaForCausalLM | 5a2f17238cc83932e00613d285f8bf6b8f4a0c3a | cc-by-nc-4.0 | 3 | 8.03 | hub, chat | 2024-04-26 | 2024-06-29 | 2 | meta-llama/Meta-Llama-3-8B-Instruct |
| lightblue/suzume-llama-3-8B-multilingual-orpo-borda-top75 | bfloat16 | 💬 | Original | LlamaForCausalLM | 555f4a0092f239557e1aa34f9d489e8156b907bb | cc-by-nc-4.0 | 3 | 8.03 | hub, chat | 2024-04-26 | 2024-06-29 | 2 | meta-llama/Meta-Llama-3-8B-Instruct |
| lkoenig/BBAI_145_ | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | 99e3e08fd5154b863b41d07b88fc8c67f4bab0ea | — | 0 | 7.616 | — | 2025-02-26 | 2025-02-26 | 1 | lkoenig/BBAI_145_ (Merge) |
| lkoenig/BBAI_200_Gemma | bfloat16 | 🤝 | Original | Gemma2ForCausalLM | ebb82acd5ce0a2c906d730d229db0260190f6056 | — | 0 | 19.3 | — | 2025-02-26 | 2025-02-26 | 1 | lkoenig/BBAI_200_Gemma (Merge) |
| lkoenig/BBAI_212_QwenLawLo | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | e4229aee1152cd8f3923528d0d1e7480a78cc798 | — | 0 | 7.616 | — | 2025-03-01 | 2025-03-01 | 1 | lkoenig/BBAI_212_QwenLawLo (Merge) |
| lkoenig/BBAI_212_Qwencore | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | 032f274a2f3fb7a4e1cd6f876d7e3fbe557d7027 | — | 0 | 7.613 | — | 2025-02-27 | 2025-02-27 | 1 | lkoenig/BBAI_212_Qwencore (Merge) |
| lkoenig/BBAI_230_Xiaqwen | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | 54ab14c2731a9f4cae610407b83b59c82bdf761a | — | 2 | 7.616 | — | 2025-02-26 | 2025-02-26 | 1 | lkoenig/BBAI_230_Xiaqwen (Merge) |
| lkoenig/BBAI_375_QwenDyancabs | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | 2a21d76baa1a59605d7b5df0ff091efd7452a001 | — | 0 | 7.616 | — | 2025-02-27 | 2025-02-27 | 1 | lkoenig/BBAI_375_QwenDyancabs (Merge) |
| lkoenig/BBAI_456_QwenKoen | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | a56aa459673eb7d685ee663b51371bc84b67c814 | — | 0 | 7.616 | — | 2025-03-01 | 2025-03-01 | 1 | lkoenig/BBAI_456_QwenKoen (Merge) |
| lkoenig/BBAI_7B_KoenQwenDyan | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | 6638a1360874843766f32c576f6cad02536fb1c8 | — | 0 | 7.616 | — | 2025-03-02 | 2025-03-08 | 1 | lkoenig/BBAI_7B_KoenQwenDyan (Merge) |
| lkoenig/BBAI_7B_Qwen2.5koen | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | fb6914a9b1b8234a73920be9ae8ed935bda35f4b | — | 0 | 7.616 | — | 2025-03-01 | 2025-03-01 | 1 | lkoenig/BBAI_7B_Qwen2.5koen (Merge) |
| lkoenig/BBAI_7B_QwenDyanKoenLo | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | ce3449d15a540fefdcf6c64ed87176fa45450e1b | — | 0 | 7.616 | — | 2025-03-02 | 2025-03-08 | 1 | lkoenig/BBAI_7B_QwenDyanKoenLo (Merge) |
| lkoenig/BBAI_7B_QwenDyancabsLAW | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | 343b0c0d3d92386e9d3756f37bb0b27a4479a1ce | — | 0 | 7.616 | — | 2025-03-01 | 2025-03-01 | 1 | lkoenig/BBAI_7B_QwenDyancabsLAW (Merge) |
| llmat/Mistral-v0.3-7B-ORPO | bfloat16 | 💬 | Original | MistralForCausalLM | 868d8a51e8deb6fd948eabe5bc296c53bcf41073 | apache-2.0 | 1 | 7.248 | hub, chat | 2024-08-04 | 2024-09-02 | 2 | mistralai/Mistral-7B-v0.3 |
| llmat/Mistral-v0.3-7B-ORPO | float16 | 💬 | Original | MistralForCausalLM | 868d8a51e8deb6fd948eabe5bc296c53bcf41073 | apache-2.0 | 1 | 7.248 | hub, chat | 2024-08-04 | 2024-08-06 | 2 | mistralai/Mistral-7B-v0.3 |
| llnYou/ECE-PRYMMAL-YL-1B-SLERP-V5 | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | 6facb36cea2f670e32d6571846f00aa4cf5aaa86 | apache-2.0 | 0 | 1.544 | hub | 2024-11-12 | 2024-11-12 | 0 | llnYou/ECE-PRYMMAL-YL-1B-SLERP-V5 |
| llnYou/ECE-PRYMMAL-YL-1B-SLERP-V6 | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | f15fb39e40475348e7d349c3ec2f346ffca39377 | apache-2.0 | 0 | 1.357 | hub | 2024-11-13 | 2024-11-13 | 0 | llnYou/ECE-PRYMMAL-YL-1B-SLERP-V6 |
| llnYou/ECE-PRYMMAL-YL-3B-SLERP-V1 | bfloat16 | 🤝 | Original | LlamaForCausalLM | 4918220543f4923137d20204a5ea396f65f6b956 | apache-2.0 | 0 | 2.81 | hub | 2024-11-12 | 2024-11-13 | 0 | llnYou/ECE-PRYMMAL-YL-3B-SLERP-V1 |
| llnYou/ECE-PRYMMAL-YL-3B-SLERP-V2 | bfloat16 | 🤝 | Original | LlamaForCausalLM | c3d4fbef1a10ef2746c47c0379b4247c784758e5 | apache-2.0 | 0 | 2.81 | hub | 2024-11-12 | 2024-11-13 | 0 | llnYou/ECE-PRYMMAL-YL-3B-SLERP-V2 |
| llnYou/ECE-PRYMMAL-YL-3B-SLERP-V3 | bfloat16 | 🤝 | Original | Phi3ForCausalLM | 90648507743059de96334fdc97309b6f2af3d01d | apache-2.0 | 0 | 3.821 | hub | 2024-11-13 | 2024-11-13 | 0 | llnYou/ECE-PRYMMAL-YL-3B-SLERP-V3 |
| lmsys/vicuna-13b-v1.3 | float16 | 🔶 | Original | LlamaForCausalLM | 6566e9cb1787585d1147dcf4f9bc48f29e1328d2 | — | 198 | 13 | hub, chat, off | 2023-06-18 | 2024-06-28 | 0 | lmsys/vicuna-13b-v1.3 |
| lmsys/vicuna-7b-v1.3 | float16 | 🔶 | Original | LlamaForCausalLM | 236eeeab96f0dc2e463f2bebb7bb49809279c6d6 | — | 132 | 7 | hub, chat, off | 2023-06-18 | 2024-06-28 | 0 | lmsys/vicuna-7b-v1.3 |
| lmsys/vicuna-7b-v1.5 | float16 | 🔶 | Original | LlamaForCausalLM | 3321f76e3f527bd14065daf69dad9344000a201d | llama2 | 335 | 7 | hub, off | 2023-07-29 | 2024-06-12 | 0 | lmsys/vicuna-7b-v1.5 |
| lodrick-the-lafted/llama-3.1-8b-instruct-ortho-v7 | bfloat16 | 🔶 | Original | LlamaForCausalLM | 6b7673cd78398c3a8c92f8e759aaae6409e96978 | wtfpl | 0 | 8.03 | hub | 2024-07-25 | 2024-07-30 | 0 | lodrick-the-lafted/llama-3.1-8b-instruct-ortho-v7 |
| lordjia/Llama-3-Cantonese-8B-Instruct | bfloat16 | 💬 | Original | LlamaForCausalLM | ea98e9b1ab3ea0d66e5270816e43d7a70aaaa151 | llama3 | 6 | 8.03 | hub, chat | 2024-07-16 | 2024-08-03 | 0 | lordjia/Llama-3-Cantonese-8B-Instruct |
| lordjia/Qwen2-Cantonese-7B-Instruct | bfloat16 | 💬 | Original | Qwen2ForCausalLM | eb8b0faee749d167fd70e74f5e579094c4cfe7fb | apache-2.0 | 3 | 7.616 | hub, chat | 2024-07-13 | 2024-08-03 | 0 | lordjia/Qwen2-Cantonese-7B-Instruct |
| lt-asset/nova-1.3b | bfloat16 | 🔶 | Original | NovaForCausalLM | 766eb459b5aa1e084b5474bb86ade09f9bed8fca | bsd-3-clause-clear | 4 | 1.347 | hub | 2024-01-20 | 2024-11-16 | 0 | lt-asset/nova-1.3b |
| lunahr/thea-3b-50r-u1 | bfloat16 | 🔶 | Original | LlamaForCausalLM | 34371d851aa8c2f6fa2e05061a357196d8892d65 | llama3.2 | 0 | 3.213 | hub, chat | 2025-01-11 | 2025-01-11 | 2 | CreitinGameplays/Llama-3.2-3b-Instruct-uncensored-refinetune (Merge) |
| lunahr/thea-v2-3b-50r | bfloat16 | 🔶 | Original | LlamaForCausalLM | b6c37e548658795006b2603dc500e6df01c674eb | llama3.2 | 0 | 3.213 | hub, chat | 2024-12-13 | 2024-12-14 | 3 | Removed |
| m42-health/Llama3-Med42-70B | bfloat16 | 🔶 | Original | LlamaForCausalLM | 867064e18aad7bf3f4795f20dcb25a1108952543 | llama3 | 44 | 70.554 | hub, chat | 2024-06-27 | 2024-12-11 | 0 | m42-health/Llama3-Med42-70B |
| macadeliccc/Samantha-Qwen-2-7B | float16 | 🔶 | Original | Qwen2ForCausalLM | 59058972fa9b56d132d04589eb17cbba277c2826 | apache-2.0 | 3 | 7.616 | hub, chat | 2024-06-15 | 2024-08-05 | 1 | Qwen/Qwen2-7B |
| macadeliccc/magistrate-3.2-3b-base | bfloat16 | 🟩 | Original | LlamaForCausalLM | 2a40ac9ca1904fca2c1e69573e27f0ff8039b738 | llama3.2 | 1 | 3.213 | hub | 2024-09-28 | 2024-10-01 | 1 | meta-llama/Llama-3.2-3B |
| macadeliccc/magistrate-3.2-3b-it | bfloat16 | 🔶 | Original | LlamaForCausalLM | 122961278c97195dd59d67b244907359013e4de5 | llama3.2 | 0 | 3.213 | hub, chat | 2024-10-01 | 2024-10-01 | 2 | meta-llama/Llama-3.2-3B |
| magnifi/Phi3_intent_v56_3_w_unknown_5_lr_0.002 | bfloat16 | 🔶 | Original | MistralForCausalLM | 8a7bdc02074a472ac693dd326c05aef56d00aa40 | apache-2.0 | 0 | 3.821 | hub, chat | 2025-03-10 | 2025-03-10 | 1 | unsloth/Phi-3-mini-4k-instruct-bnb-4bit |
| maldv/Awqward2.5-32B-Instruct | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | fd8f6751645a1923d588f80ec1d8292cb69691a1 | apache-2.0 | 4 | 32.764 | hub, chat | 2024-12-18 | 2024-12-18 | 1 | maldv/Awqward2.5-32B-Instruct (Merge) |
| maldv/Lytta2.5-32B-Instruct | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | d5ecf702a5c25e0e900fb6e44283864557b03ce5 | apache-2.0 | 2 | 32.764 | hub, chat | 2025-01-02 | 2025-01-07 | 1 | maldv/Lytta2.5-32B-Instruct (Merge) |
| maldv/Qwentile2.5-32B-Instruct | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | 1cb04716c8aba33838b7f5dad99b23b7f0c6c152 | apache-2.0 | 32 | 32.764 | hub, chat | 2024-12-19 | 2024-12-19 | 1 | maldv/Qwentile2.5-32B-Instruct (Merge) |
| maldv/badger-kappa-llama-3-8b | bfloat16 | 🤝 | Original | LlamaForCausalLM | aa6863eb816ca6ad29453b8aaf846962c4328998 | llama3 | 2 | 8.03 | hub, chat | 2024-06-02 | 2024-06-27 | 0 | maldv/badger-kappa-llama-3-8b |
| maldv/badger-lambda-llama-3-8b | bfloat16 | 🤝 | Original | LlamaForCausalLM | 8ef157d0d3c12212ca5e70d354869aed90e03f22 | cc-by-nc-4.0 | 10 | 8.03 | hub, chat | 2024-06-10 | 2024-06-26 | 0 | maldv/badger-lambda-llama-3-8b |
| maldv/badger-mu-llama-3-8b | bfloat16 | 🤝 | Original | LlamaForCausalLM | 952a269bb1e6c18ee772c6d088e74d305df4425d | cc-by-nc-4.0 | 2 | 8.03 | hub, chat | 2024-06-27 | 2024-06-27 | 0 | maldv/badger-mu-llama-3-8b |
| maldv/badger-writer-llama-3-8b | bfloat16 | 🔶 | Original | LlamaForCausalLM | 1d8134d01af87e994571ae16ccd7b31cce42418f | cc-by-nc-4.0 | 9 | 8.03 | hub, chat, merge | 2024-06-17 | 2024-06-26 | 1 | maldv/badger-writer-llama-3-8b (Merge) |
| marcuscedricridia/Cheng-1 | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | cd8c9dd37c67c2e1b7c683fdd5e72b7f08c074b9 | mit | 0 | 7.613 | hub, chat, merge | 2025-03-10 | 2025-03-12 | 0 | marcuscedricridia/Cheng-1 |
| marcuscedricridia/Cheng-2 | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | c22f780671f65fd4566fc9fefca6afdf9f09e3c0 | — | 1 | 14.766 | chat | 2025-03-12 | 2025-03-12 | 1 | marcuscedricridia/Cheng-2 (Merge) |
| marcuscedricridia/Cheng-2-v1.1 | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | c007eb2377bc2ce46fe2b75b6e306baae2fe8691 | — | 1 | 14.766 | chat | 2025-03-12 | 2025-03-12 | 1 | marcuscedricridia/Cheng-2-v1.1 (Merge) |
| marcuscedricridia/Hush-Qwen2.5-7B-MST | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | 3f9957f0c0812e781ce27ba6372ba5f1a1b88143 | — | 1 | 7.613 | chat | 2025-03-09 | 2025-03-09 | 1 | marcuscedricridia/Hush-Qwen2.5-7B-MST (Merge) |
| marcuscedricridia/Hush-Qwen2.5-7B-MST-v1.1 | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | 61c74d22df2900512a70e6320446d33c895a3706 | — | 1 | 7.613 | chat | 2025-03-09 | 2025-03-09 | 1 | marcuscedricridia/Hush-Qwen2.5-7B-MST-v1.1 (Merge) |
| marcuscedricridia/Hush-Qwen2.5-7B-MST-v1.3 | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | 4043f735c65a687241b74c8d9e62783376ace3f0 | — | 1 | 7.613 | chat | 2025-03-09 | 2025-03-09 | 1 | marcuscedricridia/Hush-Qwen2.5-7B-MST-v1.3 (Merge) |
| marcuscedricridia/Hush-Qwen2.5-7B-Preview | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | 3787aba9fef0e1ffd01757d1f3471fc84b948a05 | — | 0 | 7.613 | chat | 2025-03-07 | 2025-03-07 | 1 | marcuscedricridia/Hush-Qwen2.5-7B-Preview (Merge) |
| marcuscedricridia/Hush-Qwen2.5-7B-RP-v1.4-1M | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | 879c76f96c960efcd5db8ef1a98379319e69a5c3 | — | 2 | 7.613 | chat | 2025-03-08 | 2025-03-10 | 1 | marcuscedricridia/Hush-Qwen2.5-7B-RP-v1.4-1M (Merge) |
| marcuscedricridia/Hush-Qwen2.5-7B-v1.1 | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | ad2151f84cc141aa20a85542308b3a14add5f1fa | — | 1 | 7.613 | chat | 2025-03-08 | 2025-03-08 | 1 | marcuscedricridia/Hush-Qwen2.5-7B-v1.1 (Merge) |
| marcuscedricridia/Hush-Qwen2.5-7B-v1.2 | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | f612046f36daa8afd7a1bd396e70d3869dda8638 | — | 1 | 7.613 | chat | 2025-03-08 | 2025-03-08 | 1 | marcuscedricridia/Hush-Qwen2.5-7B-v1.2 (Merge) |
| marcuscedricridia/Hush-Qwen2.5-7B-v1.3 | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | 22714b419eb73afe1fb8016a67240634ddc99897 | — | 2 | 7.613 | chat | 2025-03-08 | 2025-03-08 | 1 | marcuscedricridia/Hush-Qwen2.5-7B-v1.3 (Merge) |
| marcuscedricridia/Hush-Qwen2.5-7B-v1.4 | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | dd001376afe7f7c98c584d201bedcc4ad234ad7e | — | 1 | 7.613 | chat | 2025-03-08 | 2025-03-08 | 1 | marcuscedricridia/Hush-Qwen2.5-7B-v1.4 (Merge) |
| marcuscedricridia/Qwen2.5-7B-Preview | bfloat16 | 💬 | Original | Qwen2ForCausalLM | 935631778e482a336c34b15fdede64d2571685f0 | — | 0 | 7.613 | chat | 2025-03-09 | 2025-03-09 | 1 | marcuscedricridia/Qwen2.5-7B-Preview (Merge) |
| marcuscedricridia/Yell-Qwen2.5-7B-Preview | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | 9c9cadc3c25e04502821433e49b17502551de37e | — | 0 | 7.613 | chat | 2025-03-08 | 2025-03-08 | 1 | marcuscedricridia/Yell-Qwen2.5-7B-Preview (Merge) |
| marcuscedricridia/Yell-Qwen2.5-7B-Preview-v1.1 | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | 99739c789f117c83527c3940a236d1741c1fae30 | — | 0 | 7.613 | chat | 2025-03-08 | 2025-03-08 | 1 | marcuscedricridia/Yell-Qwen2.5-7B-Preview-v1.1 (Merge) |
| marcuscedricridia/absolute-o1-7b | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | d11523bb20f692efb61fc72cff79eee70b0ecf0b | — | 1 | 7.613 | chat | 2025-02-27 | 2025-02-27 | 1 | marcuscedricridia/absolute-o1-7b (Merge) |
| marcuscedricridia/cursa-o1-7b | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | 5ab72cb7de828a3064d69e05008662161cd25684 | — | 1 | 7.613 | chat | 2025-02-27 | 2025-02-27 | 1 | marcuscedricridia/cursa-o1-7b (Merge) |
| marcuscedricridia/cursa-o1-7b-2-28-2025 | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | 50faf0f516e0afa0530daa7813366a147149b079 | — | 0 | 7.613 | chat | 2025-02-27 | 2025-02-27 | 1 | marcuscedricridia/cursa-o1-7b-2-28-2025 (Merge) |
| marcuscedricridia/cursa-o1-7b-v1.1 | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | 7932d91ce03991a94866f4d7291c9866b3733906 | — | 2 | 7.613 | chat | 2025-02-28 | 2025-02-28 | 1 | marcuscedricridia/cursa-o1-7b-v1.1 (Merge) |
| marcuscedricridia/cursa-o1-7b-v1.2-normalize-false | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | 96735a68663bcc13ed471697f9b1ade1551312d0 | — | 0 | 7.613 | chat | 2025-02-28 | 2025-02-28 | 1 | marcuscedricridia/cursa-o1-7b-v1.2-normalize-false (Merge) |
| marcuscedricridia/cursor-o1-7b | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | e2deb6ec40fe3aaa52d8fe30c9ef2123bd8b2abd | — | 0 | 7.616 | chat | 2025-02-26 | — | 0 | Removed |
| marcuscedricridia/cursorr-o1.2-7b | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | 7aa4af42a5100355d36120451a9b71c11b397097 | — | 0 | 7.616 | chat | 2025-02-26 | — | 0 | Removed |
| marcuscedricridia/etr1o-explicit-v1.1 | bfloat16 | 🤝 | Original | Qwen2ForCausalLM | 5d4d240f0b0abfe4566efde3ec4843a3ca1c8b31 | — | 0 | 7.613 | chat, … | … | … | … | … |

Benchmark scores, one row per checkpoint: Average ⬆️, CO₂ cost, and each benchmark's normalized score with its raw accuracy in parentheses:

| fullname | Precision | Average ⬆️ | CO₂ cost (kg) | IFEval | BBH | MATH Lvl 5 | GPQA | MUSR | MMLU-PRO |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| lesubra/ECE-PRYMMAL-3B-SLERP_2-V2 | float16 | 24.98682 | 1.071034 | 36.642442 (0.366424) | 35.710681 (0.541145) | 16.767372 (0.167674) | 9.50783 (0.321309) | 18.068229 (0.466146) | 33.224365 (0.399019) |
| lesubra/merge-test | float16 | 26.075521 | 1.937051 | 53.825738 (0.538257) | 33.353311 (0.524043) | 12.084592 (0.120846) | 9.619687 (0.322148) | 15.638281 (0.441906) | 31.931516 (0.387384) |
| lightblue/suzume-llama-3-8B-multilingual | bfloat16 | 23.986306 | 1.681979 | 66.780033 (0.6678) | 28.895092 (0.494995) | 9.441088 (0.094411) | 4.474273 (0.283557) | 7.844271 (0.397687) | 26.483082 (0.338348) |
| lightblue/suzume-llama-3-8B-multilingual-orpo-borda-full | bfloat16 | 20.301708 | 1.605133 | 58.174643 (0.581746) | 25.075475 (0.471422) | 7.628399 (0.076284) | 1.230425 (0.259228) | 4.040104 (0.322188) | 25.6612 (0.330951) |
| lightblue/suzume-llama-3-8B-multilingual-orpo-borda-half | bfloat16 | 21.509797 | 1.772674 | 62.491079 (0.624911) | 26.348598 (0.470746) | 9.063444 (0.090634) | 0 (0.244966) | 2.114583 (0.351583) | 29.041076 (0.36137) |
| lightblue/suzume-llama-3-8B-multilingual-orpo-borda-top25 | bfloat16 | 23.684768 | 1.669737 | 66.365355 (0.663654) | 27.665285 (0.486464) | 10.422961 (0.10423) | 3.020134 (0.272651) | 4.808854 (0.356604) | 29.82602 (0.368434) |
| lightblue/suzume-llama-3-8B-multilingual-orpo-borda-top75 | bfloat16 | 23.64712 | 1.893009 | 66.872454 (0.668725) | 28.056256 (0.483332) | 7.854985 (0.07855) | 3.020134 (0.272651) | 5.310938 (0.381688) | 30.767952 (0.376912) |
| lkoenig/BBAI_145_ | bfloat16 | 29.670567 | 0.84293 | 44.503473 (0.445035) | 36.730103 (0.556717) | 36.102719 (0.361027) | 8.836689 (0.316275) | 13.076042 (0.438208) | 38.774379 (0.448969) |
| lkoenig/BBAI_200_Gemma | bfloat16 | 5.086148 | 3.248205 | 7.051734 (0.070517) | 9.395846 (0.344904) | 0 (0) | 2.237136 (0.266779) | 4.289323 (0.363115) | 7.542849 (0.167886) |
| lkoenig/BBAI_212_QwenLawLo | bfloat16 | 29.879847 | 0.681575 | 45.662509 (0.456625) | 36.933124 (0.557411) | 36.02719 (0.360272) | 8.836689 (0.316275) | 13.054427 (0.436969) | 38.765145 (0.448886) |
| lkoenig/BBAI_212_Qwencore | bfloat16 | 29.286401 | 0.69663 | 43.844001 (0.43844) | 36.744079 (0.556868) | 34.89426 (0.348943) | 8.836689 (0.316275) | 12.625 (0.434333) | 38.774379 (0.448969) |
| lkoenig/BBAI_230_Xiaqwen | bfloat16 | 30.155386 | 0.727873 | 46.489315 (0.464893) | 36.828288 (0.55778) | 36.63142 (0.366314) | 8.501119 (0.313758) | 13.809375 (0.442208) | 38.672798 (0.448055) |
| lkoenig/BBAI_375_QwenDyancabs | bfloat16 | 30.232291 | 0.656472 | 45.657522 (0.456575) | 36.717182 (0.557138) | 37.76435 (0.377644) | 8.389262 (0.312919) | 14.238802 (0.446177) | 38.626625 (0.44764) |
| lkoenig/BBAI_456_QwenKoen | bfloat16 | 29.811925 | 0.666718 | 45.292823 (0.452928) | 36.549146 (0.555271) | 36.858006 (0.36858) | 8.389262 (0.312919) | 13.238802 (0.43951) | 38.543514 (0.446892) |
| lkoenig/BBAI_7B_KoenQwenDyan | bfloat16 | 32.024954 | 0.67276 | 58.072248 (0.580722) | 36.245723 (0.553657) | 37.386707 (0.373867) | 9.060403 (0.317953) | 12.942708 (0.436875) | 38.441933 (0.445977) |
| lkoenig/BBAI_7B_Qwen2.5koen | bfloat16 | 29.836312 | 0.66077 | 45.999725 (0.459997) | 36.307408 (0.554403) | 36.555891 (0.365559) | 8.389262 (0.312919) | 13.046615 (0.436906) | 38.718972 (0.448471) |
| lkoenig/BBAI_7B_QwenDyanKoenLo | bfloat16 | 30.017582 | 0.675451 | 46.631715 (0.466317) | 36.678248 (0.556246) | 36.404834 (0.364048) | 9.17226 (0.318792) | 12.721094 (0.434302) | 38.49734 (0.446476) |
| lkoenig/BBAI_7B_QwenDyancabsLAW | bfloat16 | 31.816744 | 0.655215 | 55.496859 (0.554969) | 36.779994 (0.557884) | 36.782477 (0.367825) | 9.17226 (0.318792) | 14.097656 (0.446115) | 38.571217 (0.447141) |
| llmat/Mistral-v0.3-7B-ORPO | bfloat16 | 12.39929 | 1.898061 | 37.70407 (0.377041) | 14.863159 (0.397766) | 2.416918 (0.024169) | 2.237136 (0.266779) | 2.973438 (0.355521) | 14.20102 (0.227809) |
| llmat/Mistral-v0.3-7B-ORPO | float16 | 12.024322 | 0.626818 | 36.397647 (0.363976) | 15.591491 (0.400466) | 0.151057 (0.001511) | 2.572707 (0.269295) | 2.973438 (0.352854) | 14.459589 (0.230136) |
| llnYou/ECE-PRYMMAL-YL-1B-SLERP-V5 | bfloat16 | 15.836336 | 1.280357 | 33.12533 (0.331253) | 18.879659 (0.423295) | 11.102719 (0.111027) | 4.809843 (0.286074) | 5.65026 (0.386802) | 21.450207 (0.293052) |
| llnYou/ECE-PRYMMAL-YL-1B-SLERP-V6 | bfloat16 | 9.395274 | 1.110829 | 13.876182 (0.138762) | 14.538923 (0.394403) | 0.226586 (0.002266) | 5.369128 (0.290268) | 7.365625 (0.392792) | 14.995198 (0.234957) |
| llnYou/ECE-PRYMMAL-YL-3B-SLERP-V1 | bfloat16 | 11.626794 | 1.158777 | 23.4633 (0.234633) | 15.797462 (0.401842) | 0.906344 (0.009063) | 5.816555 (0.293624) | 3.222656 (0.336448) | 20.554447 (0.28499) |
| llnYou/ECE-PRYMMAL-YL-3B-SLERP-V2 | bfloat16 | 11.813468 | 1.087618 | 23.093614 (0.230936) | 15.202244 (0.398977) | 1.283988 (0.01284) | 3.579418 (0.276846) | 6.613021 (0.358771) | 21.108525 (0.289977) |
| llnYou/ECE-PRYMMAL-YL-3B-SLERP-V3 | bfloat16 | 23.426855 | 1.086651 | 35.8081 (0.358081) | 36.625756 (0.547312) | 12.990937 (0.129909) | 7.270694 (0.30453) | 14.05026 (0.436135) | 33.815381 (0.404338) |
| lmsys/vicuna-13b-v1.3 | float16 | 10.435534 | 2.188466 | 33.435063 (0.334351) | 7.489789 (0.33844) | 1.435045 (0.01435) | 2.348993 (0.267617) | 4.091146 (0.372729) | 13.813165 (0.224318) |
| lmsys/vicuna-7b-v1.3 | float16 | 8.525809 | 1.126756 | 29.086158 (0.290862) | 6.461379 (0.329841) | 1.283988 (0.01284) | 0 (0.24245) | 5.016667 (0.379333) | 9.306664 (0.18376) |
| lmsys/vicuna-7b-v1.5 | float16 | 10.885152 | 1.205436 | 23.515716 (0.235157) | 15.152509 (0.394704) | 1.359517 (0.013595) | 1.118568 (0.258389) | 11.422656 (0.423115) | 12.741947 (0.214678) |
| lodrick-the-lafted/llama-3.1-8b-instruct-ortho-v7 | bfloat16 | 11.812819 | 1.863148 | 35.14619 (0.351462) | 14.437863 (0.390691) | 2.719033 (0.02719) | 3.020134 (0.272651) | 4.732552 (0.361594) | 10.821144 (0.19739) |
| lordjia/Llama-3-Cantonese-8B-Instruct | bfloat16 | 24.271709 | 1.535406 | 66.692598 (0.666926) | 26.791039 (0.481415) | 8.912387 (0.089124) | 5.816555 (0.293624) | 9.475521 (0.404604) | 27.942154 (0.351479) |
| lordjia/Qwen2-Cantonese-7B-Instruct | bfloat16 | 26.309196 | 2.032014 | 54.352784 (0.543528) | 32.453217 (0.521531) | 25.60423 (0.256042) | 6.040268 (0.295302) | 7.814844 (0.400385) | 31.589835 (0.384309) |
| lt-asset/nova-1.3b | bfloat16 | 3.853651 | 0.495493 | 12.14256 (0.121426) | 4.43762 (0.317001) | 1.208459 (0.012085) | 0 (0.249161) | 3.75599 (0.369781) | 1.577275 (0.114195) |
| lunahr/thea-3b-50r-u1 | bfloat16 | 19.03709 | 1.72535 | 60.302885 (0.603029) | 16.222936 (0.410467) | 10.422961 (0.10423) | 4.474273 (0.283557) | 2.706771 (0.318188) | 20.092716 (0.280834) |
| lunahr/thea-v2-3b-50r | bfloat16 | 12.953795 | 1.265276 | 37.03961 (0.370396) | 18.711908 (0.419442) | 2.416918 (0.024169) | 1.454139 (0.260906) | 2.440104 (0.322188) | 15.660092 (0.240941) |
| m42-health/Llama3-Med42-70B | bfloat16 | 35.683016 | 50.822469 | 62.910743 (0.629107) | 52.971348 (0.668789) | 22.583082 (0.225831) | 12.975391 (0.347315) | 18.628646 (0.462896) | 44.028886 (0.49626) |
| macadeliccc/Samantha-Qwen-2-7B | float16 | 25.065086 | 2.679615 | 43.771526 (0.437715) | 31.411894 (0.508234) | 21.148036 (0.21148) | 3.020134 (0.272651) | 20.160156 (0.479948) | 30.878768 (0.377909) |
| macadeliccc/magistrate-3.2-3b-base | bfloat16 | 6.046097 | 1.460686 | 11.593018 (0.11593) | 6.910281 (0.33427) | 1.132931 (0.011329) | 1.454139 (0.260906) | 7.532552 (0.397594) | 7.653664 (0.168883) |
| macadeliccc/magistrate-3.2-3b-it | bfloat16 | 7.088076 | 1.405902 | 22.918744 (0.229187) | 5.323155 (0.325651) | 1.963746 (0.019637) | 0 (0.247483) | 5.740365 (0.376323) | 6.582447 (0.159242) |
| magnifi/Phi3_intent_v56_3_w_unknown_5_lr_0.002 | bfloat16 | 7.124468 | 0.157228 | 20.181009 (0.20181) | 5.851019 (0.328156) | 0 (0) | 1.901566 (0.264262) | 9.569792 (0.412292) | 5.243425 (0.147191) |
| maldv/Awqward2.5-32B-Instruct | bfloat16 | 46.749023 | 7.445095 | 82.546975 (0.82547) | 57.207339 (0.697447) | 62.311178 (0.623112) | 12.080537 (0.340604) | 13.869531 (0.42749) | 52.478576 (0.572307) |
| maldv/Lytta2.5-32B-Instruct | bfloat16 | 24.790452 | 10.239979 | 25.079456 (0.250795) | 37.03154 (0.559971) | 34.441088 (0.344411) | 2.237136 (0.266779) | 4.973438 (0.376854) | 44.980053 (0.50482) |
| maldv/Qwentile2.5-32B-Instruct | bfloat16 | 45.900263 | 7.065083 | 73.931613 (0.739316) | 57.205878 (0.696284) | 52.190332 (0.521903) | 17.897092 (0.384228) | 19.961979 (0.468229) | 54.214687 (0.587932) |
| maldv/badger-kappa-llama-3-8b | bfloat16 | 21.166688 | 1.918251 | 46.946435 (0.469464) | 30.153239 (0.508493) | 8.610272 (0.086103) | 7.04698 (0.302852) | 4.297135 (0.37651) | 29.94607 (0.369515) |
| maldv/badger-lambda-llama-3-8b | bfloat16 | 20.94385 | 2.222044 | 48.607583 (0.486076) | 28.10305 (0.496349) | 9.441088 (0.094411) | 4.250559 (0.281879) | 4.520573 (0.375365) | 30.740248 (0.376662) |
| maldv/badger-mu-llama-3-8b | bfloat16 | 20.322171 | 1.809271 | 49.194581 (0.491946) | 30.513965 (0.514288) | 5.589124 (0.055891) | 1.230425 (0.259228) | 5.698958 (0.355458) | 29.705969 (0.367354) |
| maldv/badger-writer-llama-3-8b | bfloat16 | 21.096888 | 2.471621 | 53.031401 (0.530314) | 26.878361 (0.486389) | 7.55287 (0.075529) | 5.257271 (0.28943) | 3.195052 (0.358094) | 30.666371 (0.375997) |
| marcuscedricridia/Cheng-1 | bfloat16 | 36.058303 | 2.072174 | 77.888336 (0.778883) | 36.536367 (0.552468) | 48.942598 (0.489426) | 6.152125 (0.296141) | 9.616667 (0.407333) | 37.213726 (0.434924) |
| marcuscedricridia/Cheng-2 | bfloat16 | 42.848363 | 1.626527 | 83.373782 (0.833738) | 49.975186 (0.649899) | 54.380665 (0.543807) | 12.751678 (0.345638) | 12.016667 (0.419333) | 44.592199 (0.50133) |
| marcuscedricridia/Cheng-2-v1.1 | bfloat16 | 42.679378 | 1.667285 | 82.699349 (0.826993) | 50.248143 (0.651014) | 53.927492 (0.539275) | 12.416107 (0.343121) | 11.491146 (0.416729) | 45.294031 (0.507646) |
| marcuscedricridia/Hush-Qwen2.5-7B-MST | bfloat16 | 33.66196 | 0.695853 | 74.88331 (0.748833) | 35.350073 (0.54585) | 42.44713 (0.424471) | 7.158837 (0.303691) | 6.98724 (0.391365) | 35.145168 (0.416307) |
| marcuscedricridia/Hush-Qwen2.5-7B-MST-v1.1 | bfloat16 | 35.229152 | 0.674314 | 74.448685 (0.744487) | 36.829826 (0.55592) | 46.52568 (0.465257) | 7.494407 (0.306208) | 9.416667 (0.407333) | 36.659648 (0.429937) |
| marcuscedricridia/Hush-Qwen2.5-7B-MST-v1.3 | bfloat16 | 35.730298 | 0.706732 | 70.432009 (0.70432) | 36.451907 (0.551617) | 47.583082 (0.475831) | 8.612975 (0.314597) | 13.08151 (0.431052) | 38.220301 (0.443983) |
| marcuscedricridia/Hush-Qwen2.5-7B-Preview | bfloat16 | 35.127163 | 0.683875 | 79.624397 (0.796244) | 35.328756 (0.543106) | 37.537764 (0.375378) | 8.165548 (0.311242) | 12.726563 (0.429813) | 37.37995 (0.43642) |
| marcuscedricridia/Hush-Qwen2.5-7B-RP-v1.4-1M | bfloat16 | 33.317919 | 0.706147 | 77.278842 (0.772788) | 32.681799 (0.529512) | 33.685801 (0.336858) | 6.487696 (0.298658) | 14.942188 (0.443271) | 34.831191 (0.413481) |
| marcuscedricridia/Hush-Qwen2.5-7B-v1.1 | bfloat16 | 35.478354 | 0.690013 | 78.894999 (0.78895) | 34.400036 (0.538358) | 43.806647 (0.438066) | 8.836689 (0.316275) | 11.075521 (0.417938) | 35.856235 (0.422706) |
| marcuscedricridia/Hush-Qwen2.5-7B-v1.2 | bfloat16 | 35.538089 | 0.689471 | 78.650204 (0.786502) | 34.740628 (0.54025) | 44.033233 (0.440332) | 8.612975 (0.314597) | 11.667708 (0.421875) | 35.523788 (0.419714) |
| marcuscedricridia/Hush-Qwen2.5-7B-v1.3 | bfloat16 | 33.931434 | 0.652612 | 78.562769 (0.785628) | 33.968819 (0.532689) | 33.232628 (0.332326) | 8.277405 (0.312081) | 12.379427 (0.424635) | 37.167553 (0.434508) |
| marcuscedricridia/Hush-Qwen2.5-7B-v1.4 | bfloat16 | 35.18388 | 0.686494 | 78.345457 (0.783455) | 35.058297 (0.5423) | 42.598187 (0.425982) | 8.165548 (0.311242) | 11.430469 (0.423177) | 35.505319 (0.419548) |
| marcuscedricridia/Qwen2.5-7B-Preview | bfloat16 | 33.620642 | 0.706765 | 76.794239 (0.767942) | 33.859967 (0.535978) | 34.441088 (0.344411) | 9.8434 (0.323826) | 10.58724 (0.414031) | 36.197917 (0.425781) |
| marcuscedricridia/Yell-Qwen2.5-7B-Preview | bfloat16 | 26.147687 | 0.67613 | 58.386969 (0.58387) | 34.763375 (0.537136) | 19.259819 (0.192598) | 4.138702 (0.28104) | 9.246094 (0.404635) | 31.091164 (0.37982) |
| marcuscedricridia/Yell-Qwen2.5-7B-Preview-v1.1 | bfloat16 | 26.093953 | 0.671547 | 57.570136 (0.575701) | 34.156629 (0.534773) | 18.957704 (0.189577) | 4.809843 (0.286074) | 9.608854 (0.405938) | 31.46055 (0.383145) |
| marcuscedricridia/absolute-o1-7b | bfloat16 | 36.528608 | 0.689406 | 75.155587 (0.751556) | 35.655763 (0.546941) | 50.830816 (0.508308) | 9.284116 (0.319631) | 10.320573 (0.411365) | 37.924793 (0.441323) |
| marcuscedricridia/cursa-o1-7b | bfloat16 | 36.709657 | 0.684587 | 76.282154 (0.762822) | 35.704291 (0.546586) | 49.546828 (0.495468) | 7.606264 (0.307047) | 13.424479 (0.430063) | 37.693927 (0.439245) |
| marcuscedricridia/cursa-o1-7b-2-28-2025 | bfloat16 | 35.843864 | 0.699977 | 74.670984 (0.74671) | 34.668303 (0.538414) | 48.111782 (0.481118) | 7.606264 (0.307047) | 12.616667 (0.427333) | 37.389184 (0.436503) |
| marcuscedricridia/cursa-o1-7b-v1.1 | bfloat16 | 36.502777 | 0.693566 | 75.275491 (0.752755) | 36.066896 (0.549256) | 49.848943 (0.498489) | 7.606264 (0.307047) | 12.534375 (0.425875) | 37.684693 (0.439162) |
| marcuscedricridia/cursa-o1-7b-v1.2-normalize-false | bfloat16 | 36.799785 | 0.675199 | 76.157263 (0.761573) | 36.12773 (0.549235) | 49.924471 (0.499245) | 7.606264 (0.307047) | 12.808854 (0.427271) | 38.174128 (0.443567) |
| marcuscedricridia/cursor-o1-7b | bfloat16 | 20.571431 | 0.780022 | 41.068809 (0.410688) | 28.820714 (0.500745) | 14.123867 (0.141239) | 4.138702 (0.28104) | 10.261719 (0.410094) | 25.014775 (0.325133) |
| marcuscedricridia/cursorr-o1.2-7b | bfloat16 | 4.069131 | 0.67642 | 16.598957 (0.16599) | 3.465494 (0.306813) | 0 (0) | 0.559284 (0.254195) | 2.897135 (0.353844) | 0.893913 (0.108045) |
| marcuscedricridia/etr1o-explicit-v1.1 | bfloat16 | 8.109189 | 0.802333 | 28.803907 (0.288039) | 4.190313 (0.313166) | 0.453172 (0.004532) | … | … | … |
0.277685
3.691275
0.411052
9.348177
0.119515
2.168292
false
false
2025-03-03
0
Removed
marcuscedricridia_etr1o-explicit-v1.2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/marcuscedricridia/etr1o-explicit-v1.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">marcuscedricridia/etr1o-explicit-v1.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/marcuscedricridia__etr1o-explicit-v1.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
marcuscedricridia/etr1o-explicit-v1.2
b00fd4311767963b1e3b12c1f808ebfb428da125
4.805787
0
7.613
false
false
false
true
0.692591
0.150402
15.040204
0.294974
1.982572
0
0
0.260906
1.454139
0.403115
8.95599
0.112616
1.401817
false
false
2025-03-03
0
Removed
marcuscedricridia_etr1o-v1.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/marcuscedricridia/etr1o-v1.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">marcuscedricridia/etr1o-v1.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/marcuscedricridia__etr1o-v1.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
marcuscedricridia/etr1o-v1.1
6aaf3b43c713ac0964528705bd771b0f128c2c4d
4.967503
0
7.613
false
false
false
true
1.427497
0.15972
15.971954
0.310036
3.321014
0
0
0.256711
0.894855
0.401656
7.873698
0.115691
1.743499
false
false
2025-03-04
2025-03-04
1
marcuscedricridia/etr1o-v1.1 (Merge)
marcuscedricridia_etr1o-v1.2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/marcuscedricridia/etr1o-v1.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">marcuscedricridia/etr1o-v1.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/marcuscedricridia__etr1o-v1.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
marcuscedricridia/etr1o-v1.2
835c7a4f1cd397e72fa7e90d56c0bc02377d0722
39.912802
0
14.766
false
false
false
false
1.782141
0.7287
72.869985
0.634904
47.700911
0.358761
35.876133
0.375839
16.778523
0.471448
18.297656
0.531582
47.953605
false
false
2025-03-04
2025-03-04
1
marcuscedricridia/etr1o-v1.2 (Merge)
marcuscedricridia_fan-o1-7b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/marcuscedricridia/fan-o1-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">marcuscedricridia/fan-o1-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/marcuscedricridia__fan-o1-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
marcuscedricridia/fan-o1-7b
f687a7029b01b79314dce6b31098383f8838c8b9
20.506029
0
7.613
false
false
false
true
0.679657
0.445559
44.555889
0.484906
26.546331
0.161631
16.163142
0.284396
4.58613
0.383365
5.920573
0.327377
25.264111
false
false
2025-02-27
0
Removed
marcuscedricridia_olmner-7b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/marcuscedricridia/olmner-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">marcuscedricridia/olmner-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/marcuscedricridia__olmner-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
marcuscedricridia/olmner-7b
9c7bd318b35ab3df4ca91b5d3a3434f81680034f
35.564173
0
7.616
false
false
false
true
0.65453
0.725378
72.537755
0.547159
35.746844
0.462991
46.299094
0.307886
7.718121
0.437969
14.31276
0.430934
36.770464
false
false
2025-02-27
0
Removed
marcuscedricridia_olmner-della-7b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/marcuscedricridia/olmner-della-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">marcuscedricridia/olmner-della-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/marcuscedricridia__olmner-della-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
marcuscedricridia/olmner-della-7b
6e7d48350df281683fc54afe035a7984fc561306
36.354392
1
7.616
false
false
false
true
0.651306
0.763696
76.369588
0.549123
35.896042
0.496224
49.622356
0.301174
6.823266
0.42076
11.795052
0.43858
37.62005
false
false
2025-02-27
2025-02-27
1
marcuscedricridia/olmner-della-7b (Merge)
marcuscedricridia_olmner-o1-7b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/marcuscedricridia/olmner-o1-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">marcuscedricridia/olmner-o1-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/marcuscedricridia__olmner-o1-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
marcuscedricridia/olmner-o1-7b
b98b6ccf2b20346e760ab3db77d518ce92adf5fb
36.292533
0
7.613
false
false
false
true
0.671385
0.752755
75.275491
0.548087
35.686727
0.492447
49.244713
0.301174
6.823266
0.429906
13.104948
0.43858
37.62005
false
false
2025-02-27
0
Removed
marcuscedricridia_olmner-sbr-7b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/marcuscedricridia/olmner-sbr-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">marcuscedricridia/olmner-sbr-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/marcuscedricridia__olmner-sbr-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
marcuscedricridia/olmner-sbr-7b
8af017146aef2673138387020d311b19b6c60ef1
36.298503
1
7.613
false
false
false
true
0.697445
0.760049
76.004889
0.546164
35.715386
0.494713
49.471299
0.308725
7.829978
0.415365
10.853906
0.44124
37.915559
false
false
2025-03-01
2025-03-01
1
marcuscedricridia/olmner-sbr-7b (Merge)
marcuscedricridia_post-cursa-o1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/marcuscedricridia/post-cursa-o1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">marcuscedricridia/post-cursa-o1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/marcuscedricridia__post-cursa-o1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
marcuscedricridia/post-cursa-o1
ec7e81808cb91d95ebbcc6e35bed6aaf01c0a5bb
36.661699
0
7.613
false
false
false
true
0.690971
0.762822
76.282154
0.547969
35.827289
0.48716
48.716012
0.309564
7.941834
0.435146
13.859896
0.436087
37.343011
false
false
2025-02-27
2025-02-28
1
marcuscedricridia/post-cursa-o1 (Merge)
marcuscedricridia_pre-cursa-o1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/marcuscedricridia/pre-cursa-o1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">marcuscedricridia/pre-cursa-o1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/marcuscedricridia__pre-cursa-o1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
marcuscedricridia/pre-cursa-o1
2912216a41a29bb0cbc2f0234f26c8bd4aeaed74
36.459046
0
7.613
false
false
false
true
0.701943
0.74089
74.088973
0.546169
35.721555
0.503776
50.377644
0.309564
7.941834
0.425969
12.579427
0.442404
38.044843
false
false
2025-02-27
2025-02-28
1
marcuscedricridia/pre-cursa-o1 (Merge)
marcuscedricridia_pre-cursa-o1-v1.2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/marcuscedricridia/pre-cursa-o1-v1.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">marcuscedricridia/pre-cursa-o1-v1.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/marcuscedricridia__pre-cursa-o1-v1.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
marcuscedricridia/pre-cursa-o1-v1.2
f1db05705595b63739a88ef8ca338707e0bbafc6
36.886279
1
7.613
false
false
false
true
0.687912
0.754878
75.487817
0.548679
36.117812
0.506798
50.679758
0.312919
8.389262
0.42724
12.838281
0.440243
37.804743
false
false
2025-02-27
2025-02-28
1
marcuscedricridia/pre-cursa-o1-v1.2 (Merge)
marcuscedricridia_pre-cursa-o1-v1.3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/marcuscedricridia/pre-cursa-o1-v1.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">marcuscedricridia/pre-cursa-o1-v1.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/marcuscedricridia__pre-cursa-o1-v1.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
marcuscedricridia/pre-cursa-o1-v1.3
f33523f9bc4c3524381bede36b8d4d422c7f9cad
36.7122
0
7.613
false
false
false
true
0.683127
0.750682
75.068153
0.545452
35.468597
0.507553
50.755287
0.312919
8.389262
0.427146
12.593229
0.441988
37.99867
false
false
2025-02-27
2025-02-28
1
marcuscedricridia/pre-cursa-o1-v1.3 (Merge)
marcuscedricridia_pre-cursa-o1-v1.4_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/marcuscedricridia/pre-cursa-o1-v1.4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">marcuscedricridia/pre-cursa-o1-v1.4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/marcuscedricridia__pre-cursa-o1-v1.4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
marcuscedricridia/pre-cursa-o1-v1.4
636558c7a765cc4356a79b51ff479a50dd15fc14
36.264047
0
7.613
false
false
false
true
0.691781
0.748783
74.878323
0.549301
35.980445
0.483384
48.338369
0.305369
7.38255
0.42851
12.830469
0.443567
38.174128
false
false
2025-02-27
2025-02-28
1
marcuscedricridia/pre-cursa-o1-v1.4 (Merge)
marcuscedricridia_pre-cursa-o1-v1.6_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/marcuscedricridia/pre-cursa-o1-v1.6" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">marcuscedricridia/pre-cursa-o1-v1.6</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/marcuscedricridia__pre-cursa-o1-v1.6-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
marcuscedricridia/pre-cursa-o1-v1.6
044f5202846d83e8cf352ae999e087661ff31665
36.79518
1
7.613
false
false
false
true
0.670012
0.752755
75.275491
0.547334
35.920913
0.5
50
0.32047
9.395973
0.423365
12.253906
0.441323
37.924793
false
false
2025-02-27
2025-02-28
1
marcuscedricridia/pre-cursa-o1-v1.6 (Merge)
marcuscedricridia_r1o-et_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/marcuscedricridia/r1o-et" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">marcuscedricridia/r1o-et</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/marcuscedricridia__r1o-et-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
marcuscedricridia/r1o-et
c0393cc3b2e0f32383a0c4a0cec27eb2eb36ef7e
14.343384
0
7.613
false
false
false
true
0.671515
0.35968
35.968009
0.42092
18.912935
0.079305
7.930514
0.272651
3.020134
0.357938
2.675521
0.257979
17.553191
false
false
2025-03-02
2025-03-02
1
marcuscedricridia/r1o-et (Merge)
marcuscedricridia_sbr-o1-7b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/marcuscedricridia/sbr-o1-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">marcuscedricridia/sbr-o1-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/marcuscedricridia__sbr-o1-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
marcuscedricridia/sbr-o1-7b
8e92f7967b8dc6e3770834bbcabc031ba3e8b13d
36.69314
0
7.613
false
false
false
true
0.672254
0.745461
74.546093
0.547883
35.779664
0.498489
49.848943
0.310403
8.053691
0.440417
14.652083
0.435505
37.278369
false
false
2025-02-27
2025-02-27
1
marcuscedricridia/sbr-o1-7b (Merge)
marcuscedricridia_stray-r1o-et_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/marcuscedricridia/stray-r1o-et" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">marcuscedricridia/stray-r1o-et</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/marcuscedricridia__stray-r1o-et-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
marcuscedricridia/stray-r1o-et
7930bd53216aaad96325543fc64dc4a2917ebfce
5.127667
0
7.613
false
false
false
true
0.709433
0.156222
15.622216
0.296746
2.644668
0.004532
0.453172
0.261745
1.565996
0.408573
9.438281
0.109375
1.041667
false
false
2025-03-01
2025-03-03
1
marcuscedricridia/stray-r1o-et (Merge)
matouLeLoup_ECE-PRYMMAL-0.5B-FT-EnhancedMUSREnsembleV3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/matouLeLoup/ECE-PRYMMAL-0.5B-FT-EnhancedMUSREnsembleV3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">matouLeLoup/ECE-PRYMMAL-0.5B-FT-EnhancedMUSREnsembleV3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/matouLeLoup__ECE-PRYMMAL-0.5B-FT-EnhancedMUSREnsembleV3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
matouLeLoup/ECE-PRYMMAL-0.5B-FT-EnhancedMUSREnsembleV3
60c5853d376d4b62b19dd4c4741224d0246ec5b4
7.224121
0
0.494
false
false
false
false
1.703544
0.187322
18.732186
0.323912
7.918512
0.026435
2.643505
0.260906
1.454139
0.375208
4.601042
0.171958
7.995346
false
false
2024-11-01
0
Removed
matouLeLoup_ECE-PRYMMAL-0.5B-FT-MUSR-ENSEMBLE-V2Mathis_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/matouLeLoup/ECE-PRYMMAL-0.5B-FT-MUSR-ENSEMBLE-V2Mathis" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">matouLeLoup/ECE-PRYMMAL-0.5B-FT-MUSR-ENSEMBLE-V2Mathis</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/matouLeLoup__ECE-PRYMMAL-0.5B-FT-MUSR-ENSEMBLE-V2Mathis-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
matouLeLoup/ECE-PRYMMAL-0.5B-FT-MUSR-ENSEMBLE-V2Mathis
3fd229bcc3b4d2502ed7f3bdd48ccb5c97e83212
7.224121
0
0.494
false
false
false
false
1.72903
0.187322
18.732186
0.323912
7.918512
0.026435
2.643505
0.260906
1.454139
0.375208
4.601042
0.171958
7.995346
false
false
2024-10-31
0
Removed
matouLeLoup_ECE-PRYMMAL-0.5B-FT-V4-MUSR-ENSEMBLE-Mathis_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/matouLeLoup/ECE-PRYMMAL-0.5B-FT-V4-MUSR-ENSEMBLE-Mathis" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">matouLeLoup/ECE-PRYMMAL-0.5B-FT-V4-MUSR-ENSEMBLE-Mathis</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/matouLeLoup__ECE-PRYMMAL-0.5B-FT-V4-MUSR-ENSEMBLE-Mathis-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
matouLeLoup/ECE-PRYMMAL-0.5B-FT-V4-MUSR-ENSEMBLE-Mathis
455945ed4318bbeae008a253f877f56a68291b8b
7.224121
0
0.494
false
false
false
false
1.728122
0.187322
18.732186
0.323912
7.918512
0.026435
2.643505
0.260906
1.454139
0.375208
4.601042
0.171958
7.995346
false
false
2024-10-31
0
Removed
matouLeLoup_ECE-PRYMMAL-0.5B-FT-V4-MUSR-Mathis_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/matouLeLoup/ECE-PRYMMAL-0.5B-FT-V4-MUSR-Mathis" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">matouLeLoup/ECE-PRYMMAL-0.5B-FT-V4-MUSR-Mathis</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/matouLeLoup__ECE-PRYMMAL-0.5B-FT-V4-MUSR-Mathis-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
matouLeLoup/ECE-PRYMMAL-0.5B-FT-V4-MUSR-Mathis
dd86c3d7f77748a0ba18d911ceb93358a69ce160
7.257345
1
0.494
false
false
false
false
1.820185
0.188246
18.824608
0.323279
8.079577
0.02719
2.719033
0.263423
1.789709
0.368479
4.126563
0.172041
8.00458
false
false
2024-10-25
2024-10-31
0
matouLeLoup/ECE-PRYMMAL-0.5B-FT-V4-MUSR-Mathis
matouLeLoup_ECE-PRYMMAL-0.5B-FT-V5-MUSR-Mathis_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/matouLeLoup/ECE-PRYMMAL-0.5B-FT-V5-MUSR-Mathis" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">matouLeLoup/ECE-PRYMMAL-0.5B-FT-V5-MUSR-Mathis</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/matouLeLoup__ECE-PRYMMAL-0.5B-FT-V5-MUSR-Mathis-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
matouLeLoup/ECE-PRYMMAL-0.5B-FT-V5-MUSR-Mathis
7a9d848188a674302d64a865786d4508be19571a
5.976392
0
0.63
false
false
false
true
2.197902
0.165215
16.521496
0.302373
3.083352
0.018882
1.888218
0.256711
0.894855
0.427302
12.179427
0.111619
1.291002
false
false
2024-11-12
0
Removed
mattshumer_Reflection-Llama-3.1-70B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/mattshumer/Reflection-Llama-3.1-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mattshumer/Reflection-Llama-3.1-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mattshumer__Reflection-Llama-3.1-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mattshumer/Reflection-Llama-3.1-70B
458962ed801fac4eadd01a91a2029a3a82f4cd84
24.392555
llama3.1
1,714
70.554
true
false
false
true
39.031041
0.004521
0.452134
0.645001
47.866237
0.214502
21.450151
0.363255
15.100671
0.457656
17.540365
0.495512
43.945774
false
false
2024-09-05
2024-12-25
2
meta-llama/Meta-Llama-3.1-70B
mattshumer_ref_70_e3_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/mattshumer/ref_70_e3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mattshumer/ref_70_e3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mattshumer__ref_70_e3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mattshumer/ref_70_e3
5d2d9dbb9e0bf61879255f63f1b787296fe524cc
35.3956
llama3.1
58
70.554
true
false
false
true
64.103974
0.629432
62.943213
0.650084
49.274467
0.279456
27.945619
0.33557
11.409396
0.43276
12.995052
0.530253
47.805851
false
false
2024-09-08
2024-09-08
2
meta-llama/Meta-Llama-3.1-70B
maywell_Qwen2-7B-Multilingual-RP_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/maywell/Qwen2-7B-Multilingual-RP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">maywell/Qwen2-7B-Multilingual-RP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/maywell__Qwen2-7B-Multilingual-RP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
maywell/Qwen2-7B-Multilingual-RP
487e8f0498419e4d1188f661dbb63bd629be4638
23.450879
apache-2.0
55
7.616
true
false
false
true
1.918826
0.434718
43.471766
0.506206
30.543561
0.22432
22.432024
0.29698
6.263982
0.369563
6.228646
0.385888
31.765293
false
false
2024-06-24
2024-09-05
0
maywell/Qwen2-7B-Multilingual-RP
meditsolutions_Llama-3.1-MedIT-SUN-8B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meditsolutions/Llama-3.1-MedIT-SUN-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meditsolutions/Llama-3.1-MedIT-SUN-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meditsolutions__Llama-3.1-MedIT-SUN-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meditsolutions/Llama-3.1-MedIT-SUN-8B
0c11abbaa40e76b538b8c0f9c50e965078999087
30.19416
llama3.1
2
8.03
true
false
false
true
1.4261
0.783729
78.372939
0.518692
32.001651
0.209215
20.92145
0.308725
7.829978
0.405625
9.636458
0.391622
32.402482
false
false
2024-11-06
2024-11-06
1
meditsolutions/Llama-3.1-MedIT-SUN-8B (Merge)
meditsolutions_Llama-3.2-SUN-1B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaMedITForCausalLM
<a target="_blank" href="https://huggingface.co/meditsolutions/Llama-3.2-SUN-1B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meditsolutions/Llama-3.2-SUN-1B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meditsolutions__Llama-3.2-SUN-1B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meditsolutions/Llama-3.2-SUN-1B-Instruct
538477c528ecd80f9537b0d4ea730b7b9b529115
15.524297
llama3.2
4
1.498
true
false
false
true
0.703488
0.641297
64.129731
0.34739
9.183739
0.070997
7.099698
0.24245
0
0.351365
4.053906
0.178108
8.678709
false
false
2024-11-27
2024-11-27
1
meditsolutions/Llama-3.2-SUN-1B-Instruct (Merge)
meditsolutions_Llama-3.2-SUN-1B-chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meditsolutions/Llama-3.2-SUN-1B-chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meditsolutions/Llama-3.2-SUN-1B-chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meditsolutions__Llama-3.2-SUN-1B-chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meditsolutions/Llama-3.2-SUN-1B-chat
a67791cfc31d09c3e96bd8c62a386f6107378087
13.641366
llama3.2
2
1.498
true
false
false
true
1.095439
0.548174
54.81744
0.351446
8.690238
0.064199
6.41994
0.261745
1.565996
0.324917
1.047917
0.18376
9.306664
false
false
2024-11-03
2024-11-07
1
meditsolutions/Llama-3.2-SUN-1B-chat (Merge)
meditsolutions_Llama-3.2-SUN-2.4B-checkpoint-26000_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meditsolutions/Llama-3.2-SUN-2.4B-checkpoint-26000" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meditsolutions/Llama-3.2-SUN-2.4B-checkpoint-26000</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meditsolutions__Llama-3.2-SUN-2.4B-checkpoint-26000-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meditsolutions/Llama-3.2-SUN-2.4B-checkpoint-26000
1300885555ca8bbed20a57cf0ec9f7ae014200c3
8.143485
llama3.2
2
2.209
true
false
false
true
1.626341
0.281394
28.139448
0.301775
2.895305
0.018127
1.812689
0.277685
3.691275
0.410333
8.491667
0.134475
3.830526
false
false
2024-09-27
2024-10-04
1
meditsolutions/Llama-3.2-SUN-2.4B-checkpoint-26000 (Merge)
meditsolutions_Llama-3.2-SUN-2.4B-checkpoint-34800_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meditsolutions/Llama-3.2-SUN-2.4B-checkpoint-34800" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meditsolutions/Llama-3.2-SUN-2.4B-checkpoint-34800</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meditsolutions__Llama-3.2-SUN-2.4B-checkpoint-34800-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meditsolutions/Llama-3.2-SUN-2.4B-checkpoint-34800
ef65f05f577a69a1992349c8d33c96cd099844f7
8.193103
llama3.2
2
2.209
true
false
false
true
1.631066
0.250095
25.00953
0.316112
5.46618
0.010574
1.057402
0.286074
4.809843
0.40224
8.846615
0.135721
3.969046
false
false
2024-09-27
2024-10-05
1
meditsolutions/Llama-3.2-SUN-2.4B-checkpoint-34800 (Merge)
meditsolutions_Llama-3.2-SUN-2.4B-v1.0.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meditsolutions/Llama-3.2-SUN-2.4B-v1.0.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meditsolutions/Llama-3.2-SUN-2.4B-v1.0.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meditsolutions__Llama-3.2-SUN-2.4B-v1.0.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meditsolutions/Llama-3.2-SUN-2.4B-v1.0.0
b8a31c62ab4acbd4c645fd882d899c4ec7280677
13.31713
llama3.2
2
2.472
true
false
false
true
6.019643
0.563687
56.368657
0.339083
7.211668
0.062689
6.268882
0.25755
1.006711
0.320948
3.01849
0.154255
6.028369
false
false
2024-09-27
2024-10-20
1
meditsolutions/Llama-3.2-SUN-2.4B-v1.0.0 (Merge)