| Column | Dtype | Range / classes |
| --- | --- | --- |
| eval_name | string | lengths 12–111 |
| Precision | string | 3 classes |
| Type | string | 7 classes |
| T | string | 7 classes |
| Weight type | string | 3 classes |
| Architecture | string | 63 classes |
| Model | string | lengths 355–689 |
| fullname | string | lengths 4–102 |
| Model sha | string | lengths 0–40 |
| Average ⬆️ | float64 | 0.74–52 |
| Hub License | string | 27 classes |
| Hub ❤️ | int64 | 0–6k |
| #Params (B) | float64 | -1–141 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.03–107 |
| IFEval Raw | float64 | 0–0.9 |
| IFEval | float64 | 0–90 |
| BBH Raw | float64 | 0.24–0.75 |
| BBH | float64 | 0.25–64.1 |
| MATH Lvl 5 Raw | float64 | 0–0.52 |
| MATH Lvl 5 | float64 | 0–52.4 |
| GPQA Raw | float64 | 0.21–0.47 |
| GPQA | float64 | 0–29.4 |
| MUSR Raw | float64 | 0.29–0.6 |
| MUSR | float64 | 0–38.5 |
| MMLU-PRO Raw | float64 | 0.1–0.73 |
| MMLU-PRO | float64 | 0–70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | string | 488 classes |
| Submission Date | string | 228 classes |
| Generation | int64 | 0–10 |
| Base Model | string | lengths 4–102 |
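The schema above describes one record per evaluated model, with identity fields (`fullname`, `Model sha`), metadata flags (`MoE`, `Merged`, etc.), and benchmark scores. As a minimal sketch of querying such records once loaded into plain Python dicts (the field names here are shortened stand-ins for the schema's columns, and the three sample rows are values copied from the records that follow):

```python
# Illustrative records copied from the preview below (a small subset of columns;
# key names are simplified stand-ins for "fullname", "#Params (B)", "Average ⬆️").
rows = [
    {"fullname": "netcat420/Qwen2.5-7b-MFANN-slerp", "params_b": 7.616, "average": 25.581831},
    {"fullname": "netcat420/MFANN-Llama3.1-Abliterated-SLERP-V5", "params_b": 8.03, "average": 19.493723},
    {"fullname": "netcat420/MFANN3bv1.1", "params_b": 2.775, "average": 6.482983},
]

def best_under(records, max_params_b):
    """Return the highest-average record with fewer than max_params_b parameters."""
    eligible = [r for r in records if r["params_b"] < max_params_b]
    return max(eligible, key=lambda r: r["average"])

print(best_under(rows, 8.0)["fullname"])
```

With the sample rows above, the 8.03B model is excluded and the 7.616B Qwen2.5 merge has the higher average of the remaining two.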
netcat420_DeepSeek-R1-MFANN-TIES-unretrained-7b_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/DeepSeek-R1-MFANN-TIES-unretrained-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/DeepSeek-R1-MFANN-TIES-unretrained-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__DeepSeek-R1-MFANN-TIES-unretrained-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/DeepSeek-R1-MFANN-TIES-unretrained-7b
836a8aa29ce3e9d46c4f1ab2312f20cac9802649
5.871475
0
7.616
false
false
false
true
0.6782
0.258688
25.868806
0.308599
4.561587
0.007553
0.755287
0.255034
0.671141
0.352729
1.757812
0.114528
1.614214
false
false
2025-01-22
2025-01-22
1
netcat420/DeepSeek-R1-MFANN-TIES-unretrained-7b (Merge)
netcat420_Llama3.1-MFANN-8b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/Llama3.1-MFANN-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/Llama3.1-MFANN-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__Llama3.1-MFANN-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/Llama3.1-MFANN-8b
6714fe00996d2679e9325b503ab991f4ecc0273d
13.066711
llama3.1
0
8.03
true
false
false
false
0.700506
0.296957
29.695652
0.428115
19.286684
0.026435
2.643505
0.287752
5.033557
0.337906
2.571615
0.272523
19.169252
false
false
2024-12-23
2024-12-23
1
netcat420/Llama3.1-MFANN-8b (Merge)
netcat420_MFANN-Llama3.1-Abliterated-SLERP-TIES-V2_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN-Llama3.1-Abliterated-SLERP-TIES-V2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN-Llama3.1-Abliterated-SLERP-TIES-V2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-Llama3.1-Abliterated-SLERP-TIES-V2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN-Llama3.1-Abliterated-SLERP-TIES-V2
0e649dd355ad7d562f9346c96642c24eff35338e
19.213728
apache-2.0
0
8.03
true
false
false
false
0.704113
0.42098
42.097967
0.492376
26.93837
0.076284
7.628399
0.29698
6.263982
0.37276
4.328385
0.352227
28.025266
true
false
2024-11-08
2024-11-09
0
netcat420/MFANN-Llama3.1-Abliterated-SLERP-TIES-V2
netcat420_MFANN-Llama3.1-Abliterated-SLERP-TIES-V3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN-Llama3.1-Abliterated-SLERP-TIES-V3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN-Llama3.1-Abliterated-SLERP-TIES-V3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-Llama3.1-Abliterated-SLERP-TIES-V3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN-Llama3.1-Abliterated-SLERP-TIES-V3
381cf003a5e28d2b273226364b568cc60b857b5b
19.22203
2
8.03
false
false
false
false
0.72091
0.423802
42.380218
0.491402
26.978851
0.075529
7.55287
0.29698
6.263982
0.374062
4.491146
0.348986
27.665115
false
false
2024-11-25
2024-11-26
1
netcat420/MFANN-Llama3.1-Abliterated-SLERP-TIES-V3 (Merge)
netcat420_MFANN-Llama3.1-Abliterated-SLERP-V4_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN-Llama3.1-Abliterated-SLERP-V4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN-Llama3.1-Abliterated-SLERP-V4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-Llama3.1-Abliterated-SLERP-V4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN-Llama3.1-Abliterated-SLERP-V4
af160f1cf089ccbcbf00f99b951797a1f3daeb04
19.412059
apache-2.0
0
8.03
true
false
false
false
0.722467
0.416883
41.688276
0.490897
26.706074
0.068731
6.873112
0.305369
7.38255
0.382094
5.861719
0.351646
27.960624
true
false
2024-11-08
2024-11-09
0
netcat420/MFANN-Llama3.1-Abliterated-SLERP-V4
netcat420_MFANN-Llama3.1-Abliterated-SLERP-V5_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN-Llama3.1-Abliterated-SLERP-V5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN-Llama3.1-Abliterated-SLERP-V5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-Llama3.1-Abliterated-SLERP-V5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN-Llama3.1-Abliterated-SLERP-V5
e0502b359816fe3ecd4f7206e5230398604fdfe2
19.493723
2
8.03
false
false
false
false
0.705626
0.432895
43.289472
0.495189
27.367143
0.081571
8.1571
0.293624
5.816555
0.378125
5.165625
0.344498
27.166445
false
false
2024-11-25
2024-11-26
1
netcat420/MFANN-Llama3.1-Abliterated-SLERP-V5 (Merge)
netcat420_MFANN-Llama3.1-Abliterated-Slerp-TIES_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN-Llama3.1-Abliterated-Slerp-TIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN-Llama3.1-Abliterated-Slerp-TIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-Llama3.1-Abliterated-Slerp-TIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN-Llama3.1-Abliterated-Slerp-TIES
dbe0a3b69206c042de2b0a96fc156feeecaa49c7
19.134712
2
8.03
false
false
false
false
0.773314
0.429347
42.934746
0.496751
27.599829
0.059668
5.966767
0.291946
5.592841
0.368698
4.58724
0.353142
28.126847
false
false
2024-10-28
2024-10-29
1
netcat420/MFANN-Llama3.1-Abliterated-Slerp-TIES (Merge)
netcat420_MFANN-Llama3.1-Abliterated-Slerp-V3.2_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN-Llama3.1-Abliterated-Slerp-V3.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN-Llama3.1-Abliterated-Slerp-V3.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-Llama3.1-Abliterated-Slerp-V3.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN-Llama3.1-Abliterated-Slerp-V3.2
56abb76e65cbf9dc49af662b09894d119d49705a
18.91525
1
8.03
false
false
false
false
0.741419
0.412811
41.281134
0.497825
27.774394
0.061934
6.193353
0.287752
5.033557
0.375427
5.128385
0.352726
28.080674
false
false
2024-10-28
2024-10-29
1
netcat420/MFANN-Llama3.1-Abliterated-Slerp-V3.2 (Merge)
netcat420_MFANN-SFT_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN-SFT
247f2ce5841d38cef59b73a7f8af857627d254bf
17.919418
2
8.03
false
false
false
false
0.6558
0.368223
36.822298
0.485189
26.208533
0.058912
5.891239
0.316275
8.836689
0.372542
3.801042
0.33361
25.956708
false
false
2024-12-16
2024-12-20
1
netcat420/MFANN-SFT (Merge)
netcat420_MFANN-abliterated-phi2-merge-unretrained_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN-abliterated-phi2-merge-unretrained" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN-abliterated-phi2-merge-unretrained</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-abliterated-phi2-merge-unretrained-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN-abliterated-phi2-merge-unretrained
cfc2d479871655f620dd741d8938b0e4b6df1d3e
9.538074
0
2.775
false
false
false
true
0.447636
0.300504
30.050377
0.410413
17.590366
0.022659
2.265861
0.260906
1.454139
0.318344
0.559635
0.147773
5.308067
false
false
2025-01-15
2025-01-15
1
netcat420/MFANN-abliterated-phi2-merge-unretrained (Merge)
netcat420_MFANN-llama3.1-Abliterated-SLERP_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN-llama3.1-Abliterated-SLERP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN-llama3.1-Abliterated-SLERP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-llama3.1-Abliterated-SLERP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN-llama3.1-Abliterated-SLERP
0c7b2916727e6c28bbca2aa613b8247b66905915
13.90681
1
8.03
false
false
false
false
0.773579
0.259063
25.906262
0.45745
22.280625
0.049849
4.984894
0.27349
3.131991
0.380917
5.714583
0.292803
21.422503
false
false
2024-09-25
2024-10-07
1
netcat420/MFANN-llama3.1-Abliterated-SLERP (Merge)
netcat420_MFANN-llama3.1-abliterated-SLERP-v3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN-llama3.1-abliterated-SLERP-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN-llama3.1-abliterated-SLERP-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-llama3.1-abliterated-SLERP-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN-llama3.1-abliterated-SLERP-v3
f90a20024060942826302c30860572c227dd4013
18.080026
llama3.1
1
8.03
true
false
false
false
0.79157
0.379939
37.993856
0.493058
27.18727
0.066465
6.646526
0.291107
5.480984
0.366031
3.053906
0.353059
28.117612
true
false
2024-10-07
2024-10-07
1
netcat420/MFANN-llama3.1-abliterated-SLERP-v3 (Merge)
netcat420_MFANN-llama3.1-abliterated-SLERP-v3.1_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN-llama3.1-abliterated-SLERP-v3.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN-llama3.1-abliterated-SLERP-v3.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-llama3.1-abliterated-SLERP-v3.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN-llama3.1-abliterated-SLERP-v3.1
6d306eb66466cb8e1456a36f3895890a117e91e4
19.029174
llama3.1
1
8.03
true
false
false
false
1.749808
0.420155
42.015519
0.492069
27.026316
0.073263
7.326284
0.292785
5.704698
0.368635
3.846094
0.354305
28.256132
true
false
2024-10-08
2024-10-17
1
netcat420/MFANN-llama3.1-abliterated-SLERP-v3.1 (Merge)
netcat420_MFANN-llama3.1-abliterated-v2_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN-llama3.1-abliterated-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN-llama3.1-abliterated-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-llama3.1-abliterated-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN-llama3.1-abliterated-v2
3d0a5d3634726e1a63ac84bee561b346960ca1d7
19.745935
0
8.03
false
false
false
false
0.824524
0.442911
44.291147
0.494083
27.353618
0.072508
7.250755
0.292785
5.704698
0.384542
6.201042
0.349069
27.67435
false
false
2024-10-04
2024-10-07
1
netcat420/MFANN-llama3.1-abliterated-v2 (Merge)
netcat420_MFANN-phigments-slerp-V2_float16
float16
🤝 base merges and moerges
🤝
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN-phigments-slerp-V2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN-phigments-slerp-V2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-phigments-slerp-V2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN-phigments-slerp-V2
94596dab22ab78f0d2ec00b8e33c8fa98581ad0f
16.004358
0
2.78
false
false
false
false
0.40812
0.32316
32.316033
0.482728
26.927492
0.015861
1.586103
0.272651
3.020134
0.403729
13.099479
0.271692
19.076906
false
false
2024-10-23
2024-10-26
1
netcat420/MFANN-phigments-slerp-V2 (Merge)
netcat420_MFANN-phigments-slerp-V3.2_float16
float16
🤝 base merges and moerges
🤝
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN-phigments-slerp-V3.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN-phigments-slerp-V3.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-phigments-slerp-V3.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN-phigments-slerp-V3.2
3fdb0794f6eb757bf2e4a6f378caed1863e9074c
16.340009
0
2.78
false
false
false
false
0.289154
0.352436
35.243598
0.480855
26.918035
0.026435
2.643505
0.283557
4.474273
0.370771
9.813021
0.270529
18.947621
false
false
2025-02-06
2025-02-06
1
netcat420/MFANN-phigments-slerp-V3.2 (Merge)
netcat420_MFANN-phigments-slerp-V3.3_float16
float16
🤝 base merges and moerges
🤝
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN-phigments-slerp-V3.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN-phigments-slerp-V3.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-phigments-slerp-V3.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN-phigments-slerp-V3.3
c05613efa47825622aa16e2b4f881549cdbec997
17.041853
0
2.78
false
false
false
false
0.295739
0.369097
36.909733
0.48953
28.170622
0.02568
2.567976
0.275168
3.355705
0.389219
11.21901
0.280253
20.028073
false
false
2025-02-06
2025-02-06
1
netcat420/MFANN-phigments-slerp-V3.3 (Merge)
netcat420_MFANN3b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN3b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN3b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN3b
ba03f833e89c335a5ee8f523a95892a15d22070e
12.463626
mit
0
2.78
true
false
false
false
0.375993
0.252444
25.244352
0.443313
22.239211
0.010574
1.057402
0.291946
5.592841
0.360604
6.142188
0.230552
14.505762
false
false
2024-12-13
2024-12-14
1
netcat420/MFANN3b (Merge)
netcat420_MFANN3bv0.15_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN3bv0.15" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN3bv0.15</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3bv0.15-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN3bv0.15
20dbdfb9154cc2f6d43651fc8cea63a120220dc7
11.811262
mit
0
2.78
true
false
false
false
0.467138
0.201211
20.121057
0.453931
23.469347
0.019637
1.963746
0.251678
0.223714
0.395792
8.773958
0.246842
16.315751
false
false
2024-07-04
2024-07-05
0
netcat420/MFANN3bv0.15
netcat420_MFANN3bv0.18_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN3bv0.18" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN3bv0.18</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3bv0.18-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN3bv0.18
3e792e3413217b63ea9caa0e8b8595fbeb236a69
12.574348
mit
0
2.78
true
false
false
false
0.482514
0.220645
22.064456
0.451437
23.073404
0.020393
2.039275
0.25755
1.006711
0.402365
10.595573
0.25
16.666667
false
false
2024-07-25
2024-07-25
0
netcat420/MFANN3bv0.18
netcat420_MFANN3bv0.19_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN3bv0.19" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN3bv0.19</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3bv0.19-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN3bv0.19
073d42274686f5cb6ef6ff9f6ade24eab198e1f2
12.503372
0
2.78
false
false
false
false
0.486488
0.225815
22.581528
0.45158
22.907055
0.017372
1.73716
0.25755
1.006711
0.402396
9.899479
0.251995
16.888298
false
false
2024-08-04
2024-08-08
0
netcat420/MFANN3bv0.19
netcat420_MFANN3bv0.20_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN3bv0.20" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN3bv0.20</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3bv0.20-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN3bv0.20
ac8ba24559cbdb5704d77b602580d911c265fdee
12.383183
mit
0
2.78
true
false
false
false
0.509418
0.219346
21.934578
0.449337
22.790711
0.015106
1.510574
0.259228
1.230425
0.407729
10.166146
0.25
16.666667
false
false
2024-08-29
2024-08-29
2
netcat420/MFANN3bv0.19.12 (Merge)
netcat420_MFANN3bv0.21_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN3bv0.21" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN3bv0.21</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3bv0.21-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN3bv0.21
8e78416dce916b69247fa03bd587369d0dade5ed
11.703842
mit
0
2.78
true
false
false
false
1.446786
0.191519
19.15185
0.447002
22.583426
0.01284
1.283988
0.264262
1.901566
0.375948
9.826823
0.239279
15.475399
false
false
2024-09-23
2024-09-24
1
netcat420/MFANN3bv0.21 (Merge)
netcat420_MFANN3bv0.22_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN3bv0.22" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN3bv0.22</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3bv0.22-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN3bv0.22
20c26f267ebe62ef1da037a5b840a304cb8d740b
11.916627
mit
0
2.78
true
false
false
false
0.395668
0.197938
19.793814
0.448511
22.491537
0.006042
0.60423
0.261745
1.565996
0.352135
10.183594
0.251745
16.860594
false
false
2024-10-25
2024-10-26
0
netcat420/MFANN3bv0.22
netcat420_MFANN3bv0.23_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN3bv0.23" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN3bv0.23</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3bv0.23-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN3bv0.23
93eacd43dcb307016e22a4d9f9f8deef49cd9111
11.183676
0
2.78
false
false
false
false
0.388184
0.204808
20.480769
0.449542
22.696341
0.009063
0.906344
0.251678
0.223714
0.34274
7.042448
0.241772
15.752438
false
false
2024-11-06
2024-11-07
0
netcat420/MFANN3bv0.23
netcat420_MFANN3bv0.24_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN3bv0.24" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN3bv0.24</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3bv0.24-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN3bv0.24
55813c2586488a2e7be5883f7e695396f5629d3e
11.520757
mit
0
2.78
true
false
false
false
0.384537
0.220045
22.004504
0.440735
21.545385
0.010574
1.057402
0.258389
1.118568
0.352073
8.375781
0.235206
15.022902
false
false
2024-11-21
2024-11-22
0
netcat420/MFANN3bv0.24
netcat420_MFANN3bv1.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN3bv1.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN3bv1.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3bv1.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN3bv1.1
2089ba193df157575eae482f0df4907fd3ea14ae
6.482983
0
2.775
false
false
false
true
0.38816
0.250695
25.069482
0.339709
8.391709
0.009819
0.981873
0.266779
2.237136
0.322313
0.455729
0.115858
1.761968
false
false
2025-01-03
2025-01-03
0
netcat420/MFANN3bv1.1
netcat420_MFANN3bv1.2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN3bv1.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN3bv1.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3bv1.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN3bv1.2
0643ded63b35beb43caf2d2c0bd8003fdb81b0ec
7.770948
mit
0
2.775
true
false
false
true
0.399613
0.268605
26.860508
0.365993
11.121792
0.009063
0.906344
0.263423
1.789709
0.315552
0.94401
0.14503
5.003324
false
false
2025-01-21
2025-01-22
1
netcat420/MFANN3bv1.2 (Merge)
netcat420_MFANN3bv1.3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN3bv1.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN3bv1.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3bv1.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN3bv1.3
dab137e547fa2e9f23dbf74ec602acfc6131e5a0
11.470315
0
2.78
false
false
false
false
0.412018
0.254667
25.466651
0.445631
22.637009
0.017372
1.73716
0.25755
1.006711
0.329875
3.801042
0.22756
14.173316
false
false
2025-01-31
2025-02-01
0
netcat420/MFANN3bv1.3
netcat420_MFANN3bv1.4_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN3bv1.4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN3bv1.4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3bv1.4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN3bv1.4
bed48b1506d7f8866f7d23c89faee4ff76690f5a
16.35913
mit
0
2.78
true
false
false
false
0.329459
0.352436
35.243598
0.480855
26.918035
0.028701
2.870091
0.282718
4.362416
0.370771
9.813021
0.270529
18.947621
false
false
2025-02-06
2025-02-06
1
netcat420/MFANN3bv1.4 (Merge)
netcat420_MFANNv0.19_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANNv0.19" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANNv0.19</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANNv0.19-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANNv0.19
af26a25549b7ad291766c479bebda58f15fbff42
14.187656
llama3.1
0
8.03
true
false
false
false
0.957079
0.305674
30.56745
0.473138
24.924106
0.029456
2.945619
0.307047
7.606264
0.352698
2.720573
0.247257
16.361924
false
false
2024-07-27
2024-07-27
0
netcat420/MFANNv0.19
netcat420_MFANNv0.20_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANNv0.20" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANNv0.20</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANNv0.20-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANNv0.20
e612e57c933870b8990ac2bc217c434f3ffc84bd
16.524597
llama3.1
0
8.03
true
false
false
false
0.867884
0.347865
34.786478
0.457443
22.401697
0.053625
5.362538
0.290268
5.369128
0.387396
6.757813
0.320229
24.469932
false
false
2024-08-07
2024-08-08
0
netcat420/MFANNv0.20
netcat420_MFANNv0.21_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANNv0.21" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANNv0.21</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANNv0.21-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANNv0.21
8c71d0eb419f54c489fa1ddf55d4bd18a1fb27d8
15.898755
llama3
0
8.03
true
false
false
false
0.879411
0.32331
32.330993
0.457637
22.058432
0.058157
5.81571
0.278523
3.803132
0.399333
8.816667
0.303108
22.567598
false
false
2024-08-31
2024-09-02
2
netcat420/MFANNv0.20.12 (Merge)
netcat420_MFANNv0.22.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANNv0.22.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANNv0.22.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANNv0.22.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANNv0.22.1
98108142480b802a3e1bb27e3d47075a4ea3a4f1
15.71773
llama3.1
0
8.03
true
false
false
false
0.840529
0.308947
30.894693
0.466089
23.602793
0.056647
5.664653
0.276007
3.467562
0.375302
4.646094
0.334275
26.030585
false
false
2024-10-04
2024-10-05
1
netcat420/MFANNv0.22.1 (Merge)
netcat420_MFANNv0.23_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANNv0.23" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANNv0.23</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANNv0.23-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANNv0.23
cf7fb44a8c858602d7fcba58adcbd514c7e08ba4
16.652656
llama3.1
1
8.03
true
false
false
false
0.81038
0.312744
31.274352
0.48981
27.042345
0.049849
4.984894
0.284396
4.58613
0.376792
5.498958
0.338763
26.529255
false
false
2024-10-27
2024-10-29
1
netcat420/MFANNv0.23 (Merge)
netcat420_MFANNv0.24_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANNv0.24" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANNv0.24</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANNv0.24-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANNv0.24
57ce382fede1adce68bdb95a386255fa363077d7
16.398374
llama3.1
1
8.03
true
false
false
false
0.743903
0.316241
31.624091
0.479027
25.351725
0.061178
6.117825
0.284396
4.58613
0.375396
4.624479
0.334774
26.085993
false
false
2024-11-07
2024-11-09
1
netcat420/MFANNv0.24 (Merge)
netcat420_MFANNv0.25_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANNv0.25" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANNv0.25</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANNv0.25-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANNv0.25
cff1e1772fc7f4f3e68ad53d8589df3f52556e38
16.559201
llama3.1
2
8.03
true
false
false
false
0.712288
0.346666
34.666574
0.479407
25.409784
0.055891
5.589124
0.280201
4.026846
0.368792
3.632292
0.334275
26.030585
false
false
2024-11-25
2024-11-26
1
netcat420/MFANNv0.25 (Merge)
netcat420_Qwen2.5-7B-nerd-uncensored-v0.9-MFANN_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/Qwen2.5-7B-nerd-uncensored-v0.9-MFANN" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/Qwen2.5-7B-nerd-uncensored-v0.9-MFANN</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__Qwen2.5-7B-nerd-uncensored-v0.9-MFANN-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/Qwen2.5-7B-nerd-uncensored-v0.9-MFANN
cc114e017a8d69c0940fe3bdde0f2e1cafeb1078
23.587517
apache-2.0
2
7.616
true
false
false
true
0.694562
0.587841
58.784137
0.523666
32.266986
0.070997
7.099698
0.28104
4.138702
0.392573
6.971615
0.390376
32.263963
false
false
2025-01-02
2025-01-02
0
netcat420/Qwen2.5-7B-nerd-uncensored-v0.9-MFANN
netcat420_Qwen2.5-7b-MFANN-slerp_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/Qwen2.5-7b-MFANN-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/Qwen2.5-7b-MFANN-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__Qwen2.5-7b-MFANN-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/Qwen2.5-7b-MFANN-slerp
7760e13b9b6c654ce6c5509f865e4e54f8a00ef6
25.581831
mit
2
7.616
true
false
false
true
0.647779
0.653212
65.321237
0.508873
30.361031
0.159366
15.936556
0.295302
6.040268
0.407302
8.979427
0.341672
26.852467
false
false
2025-01-25
2025-01-25
0
netcat420/Qwen2.5-7b-MFANN-slerp
netcat420_Qwen2.5-7b-nerd-uncensored-MFANN-slerp_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/Qwen2.5-7b-nerd-uncensored-MFANN-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/Qwen2.5-7b-nerd-uncensored-MFANN-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__Qwen2.5-7b-nerd-uncensored-MFANN-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/Qwen2.5-7b-nerd-uncensored-MFANN-slerp
c70d4e9569bcde272d74d80e742f7a46ec6d37fc
4.248121
0
7.616
false
false
false
true
0.798612
0.156447
15.644712
0.292011
1.755723
0
0
0.260067
1.342282
0.379177
5.630469
0.11004
1.115544
false
false
2025-01-25
2025-01-25
1
netcat420/Qwen2.5-7b-nerd-uncensored-MFANN-slerp (Merge)
netcat420_Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN
f2fdf326b731948216853ada912b94cd0bc71fb9
22.19936
apache-2.0
1
7.616
true
false
false
true
0.655189
0.574227
57.422749
0.507145
29.98075
0.064955
6.495468
0.292785
5.704698
0.405844
9.630469
0.315658
23.962027
false
false
2024-12-31
2025-01-01
0
netcat420/Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN
netcat420_Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN-Slerp-Unretrained_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN-Slerp-Unretrained" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN-Slerp-Unretrained</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN-Slerp-Unretrained-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN-Slerp-Unretrained
f3e8300e7948b878564f1f4de1d98fc03cc18e32
25.427966
1
7.616
false
false
false
true
0.648431
0.648641
64.864116
0.506557
29.939053
0.141239
14.123867
0.298658
6.487696
0.415208
10.134375
0.343168
27.018691
false
false
2025-01-18
2025-01-19
1
netcat420/Qwen2.5-Coder-Scholar-7B-Abliterated-MFANN-Slerp-Unretrained (Merge)
netcat420_Qwen2.5-DeepSeek-R1-MFANN-Slerp-7b_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/Qwen2.5-DeepSeek-R1-MFANN-Slerp-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/Qwen2.5-DeepSeek-R1-MFANN-Slerp-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__Qwen2.5-DeepSeek-R1-MFANN-Slerp-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/Qwen2.5-DeepSeek-R1-MFANN-Slerp-7b
5d4016402d161c445c5a5982e5783c97bd37d3b2
8.365737
1
7.616
false
false
false
true
0.70255
0.267556
26.755564
0.378902
13.455601
0.002266
0.226586
0.232383
0
0.352792
2.232292
0.167719
7.524379
false
false
2025-01-23
2025-01-23
1
netcat420/Qwen2.5-DeepSeek-R1-MFANN-Slerp-7b (Merge)
netcat420_Qwen2.5-MFANN-7b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/Qwen2.5-MFANN-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/Qwen2.5-MFANN-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__Qwen2.5-MFANN-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/Qwen2.5-MFANN-7b
30757ed2483d24b161febb79bb8f6485bba6cb20
23.415945
1
7.616
false
false
false
true
0.68481
0.609723
60.972331
0.505435
30.29029
0.112538
11.253776
0.286074
4.809843
0.402063
8.357813
0.323305
24.811613
false
false
2025-01-24
2025-01-24
0
netcat420/Qwen2.5-MFANN-7b
netcat420_qwen2.5-MFANN-7b-SLERPv1.1_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/qwen2.5-MFANN-7b-SLERPv1.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/qwen2.5-MFANN-7b-SLERPv1.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__qwen2.5-MFANN-7b-SLERPv1.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/qwen2.5-MFANN-7b-SLERPv1.1
62c8ce3c441cc8c0f7aa89189d008689c39935f9
25.69121
0
7.616
false
false
false
true
1.305276
0.655485
65.548522
0.507476
29.976912
0.159366
15.936556
0.290268
5.369128
0.412635
10.11276
0.34483
27.203384
false
false
2025-02-03
2025-02-03
1
netcat420/qwen2.5-MFANN-7b-SLERPv1.1 (Merge)
netease-youdao_Confucius-o1-14B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/netease-youdao/Confucius-o1-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netease-youdao/Confucius-o1-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netease-youdao__Confucius-o1-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netease-youdao/Confucius-o1-14B
e6c64e53adbcbbdff9e2114b4f61bd4f2aa1602c
32.246031
apache-2.0
33
14.77
true
false
false
true
1.891638
0.63785
63.784979
0.629977
47.345233
0.054381
5.438066
0.364933
15.324385
0.433813
14.193229
0.526513
47.390293
false
false
2025-01-20
2025-01-27
2
Qwen/Qwen2.5-14B
newsbang_Homer-7B-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/newsbang/Homer-7B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">newsbang/Homer-7B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/newsbang__Homer-7B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
newsbang/Homer-7B-v0.1
c953cc313ef5e5029efd057c0d3809a3b8d1cf9f
31.333866
apache-2.0
0
7.616
true
false
false
false
0.690734
0.610872
61.087249
0.560139
37.309227
0.282477
28.247734
0.324664
9.955257
0.435698
12.795573
0.447473
38.608156
false
false
2024-11-14
2024-11-14
0
newsbang/Homer-7B-v0.1
newsbang_Homer-7B-v0.2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/newsbang/Homer-7B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">newsbang/Homer-7B-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/newsbang__Homer-7B-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
newsbang/Homer-7B-v0.2
50b4ca941657ed362f5660aed8274a59a6b3fe2d
33.114663
0
7.616
false
false
false
true
0.674598
0.749383
74.938275
0.551733
36.403486
0.253776
25.377644
0.332215
10.961969
0.42975
13.11875
0.440991
37.887855
false
false
2024-11-15
0
Removed
newsbang_Homer-v0.3-Qwen2.5-7B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/newsbang/Homer-v0.3-Qwen2.5-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">newsbang/Homer-v0.3-Qwen2.5-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/newsbang__Homer-v0.3-Qwen2.5-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
newsbang/Homer-v0.3-Qwen2.5-7B
4fa38c6c590d8e9bbf2075b2fa9cc37e75cde5d4
31.088203
0
7.616
false
false
false
true
0.585603
0.515401
51.540136
0.548059
36.413677
0.295317
29.531722
0.333893
11.185682
0.474365
19.46224
0.445562
38.395759
false
false
2024-11-18
0
Removed
newsbang_Homer-v0.4-Qwen2.5-7B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/newsbang/Homer-v0.4-Qwen2.5-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">newsbang/Homer-v0.4-Qwen2.5-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/newsbang__Homer-v0.4-Qwen2.5-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
newsbang/Homer-v0.4-Qwen2.5-7B
e5b73b06e63de7f77845463f8a11c93e82befd15
33.918837
0
7.616
false
false
false
true
0.63972
0.799941
79.994082
0.55331
36.603703
0.276435
27.643505
0.315436
8.724832
0.431083
13.185417
0.436253
37.36148
false
false
2024-11-18
0
Removed
newsbang_Homer-v0.5-Qwen2.5-7B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/newsbang/Homer-v0.5-Qwen2.5-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">newsbang/Homer-v0.5-Qwen2.5-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/newsbang__Homer-v0.5-Qwen2.5-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> has been flagged! <a target="_blank" href="https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard/discussions/1022" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">See discussion #1022</a>
newsbang/Homer-v0.5-Qwen2.5-7B
9dc7090b2226f9a2217f593518f734e3246001f9
34.600199
0
7.616
false
false
true
true
0.672584
0.788076
78.807564
0.554018
36.678089
0.362538
36.253776
0.302852
7.04698
0.419302
11.379427
0.436918
37.435358
true
false
2024-11-20
0
Removed
newsbang_Homer-v1.0-Qwen2.5-72B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/newsbang/Homer-v1.0-Qwen2.5-72B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">newsbang/Homer-v1.0-Qwen2.5-72B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/newsbang__Homer-v1.0-Qwen2.5-72B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
newsbang/Homer-v1.0-Qwen2.5-72B
c7f3c5c131c046626f8d33eb615c1a0aba19998b
47.351083
apache-2.0
6
72.706
true
false
false
false
14.774429
0.762772
76.277167
0.73098
62.274065
0.483384
48.338369
0.416107
22.147651
0.467729
17.899479
0.614528
57.16977
false
false
2024-12-16
2024-12-16
0
newsbang/Homer-v1.0-Qwen2.5-72B
newsbang_Homer-v1.0-Qwen2.5-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/newsbang/Homer-v1.0-Qwen2.5-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">newsbang/Homer-v1.0-Qwen2.5-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/newsbang__Homer-v1.0-Qwen2.5-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
newsbang/Homer-v1.0-Qwen2.5-7B
4795825dff1b68dd2cc02b3bd39598a161c09c66
32.145228
apache-2.0
2
7.616
true
false
false
false
0.639252
0.639274
63.927379
0.565525
37.810847
0.303625
30.362538
0.322148
9.619687
0.427823
11.877865
0.453457
39.27305
false
false
2024-12-04
2024-12-04
0
newsbang/Homer-v1.0-Qwen2.5-7B
nguyentd_FinancialAdvice-Qwen2.5-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nguyentd/FinancialAdvice-Qwen2.5-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nguyentd/FinancialAdvice-Qwen2.5-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nguyentd__FinancialAdvice-Qwen2.5-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nguyentd/FinancialAdvice-Qwen2.5-7B
5c3421d5a980d0b2365b0d704ead30c9e534a019
20.935465
apache-2.0
1
7.616
true
false
false
false
0.654445
0.449606
44.960593
0.473093
25.630436
0.093656
9.365559
0.294463
5.928412
0.40249
9.144531
0.375249
30.583259
false
false
2024-10-21
2024-11-18
1
nguyentd/FinancialAdvice-Qwen2.5-7B (Merge)
ngxson_MiniThinky-1B-Llama-3.2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ngxson/MiniThinky-1B-Llama-3.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ngxson/MiniThinky-1B-Llama-3.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ngxson__MiniThinky-1B-Llama-3.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ngxson/MiniThinky-1B-Llama-3.2
a5e5adf4f7e63f7127a72def90ba3a627bae36bf
6.421003
4
1.236
false
false
false
true
0.747681
0.277148
27.714797
0.314227
4.347795
0.026435
2.643505
0.239094
0
0.343365
2.18724
0.114694
1.632683
false
false
2025-01-06
2025-01-07
1
ngxson/MiniThinky-1B-Llama-3.2 (Merge)
ngxson_MiniThinky-v2-1B-Llama-3.2_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ngxson/MiniThinky-v2-1B-Llama-3.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ngxson/MiniThinky-v2-1B-Llama-3.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ngxson__MiniThinky-v2-1B-Llama-3.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ngxson/MiniThinky-v2-1B-Llama-3.2
0eb811aca13439292d4151456577a527a2982c46
6.374444
38
1.236
false
false
false
true
0.731149
0.296307
29.630713
0.320511
4.893769
0.018127
1.812689
0.239933
0
0.335615
0.61849
0.111619
1.291002
false
false
2025-01-08
2025-01-09
1
ngxson/MiniThinky-v2-1B-Llama-3.2 (Merge)
nhyha_N3N_Delirium-v1_1030_0227_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/nhyha/N3N_Delirium-v1_1030_0227" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nhyha/N3N_Delirium-v1_1030_0227</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nhyha__N3N_Delirium-v1_1030_0227-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nhyha/N3N_Delirium-v1_1030_0227
41eabc719bd611e2bd0094b0842df84916a57a46
31.14332
apache-2.0
0
10.159
true
false
false
true
2.131856
0.802289
80.228904
0.589069
40.77504
0.093656
9.365559
0.337248
11.63311
0.409812
9.859896
0.414977
34.997414
false
false
2024-10-30
2024-11-04
2
unsloth/gemma-2-9b-it
nhyha_N3N_Llama-3.1-8B-Instruct_1028_0216_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/nhyha/N3N_Llama-3.1-8B-Instruct_1028_0216" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nhyha/N3N_Llama-3.1-8B-Instruct_1028_0216</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nhyha__N3N_Llama-3.1-8B-Instruct_1028_0216-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nhyha/N3N_Llama-3.1-8B-Instruct_1028_0216
d0715a631898112c9c3b729d0334588a2ff636d8
23.403604
apache-2.0
0
8.03
true
false
false
false
0.749506
0.478083
47.80826
0.505374
28.980464
0.167674
16.767372
0.306208
7.494407
0.405031
10.06224
0.36378
29.30888
false
false
2024-10-28
2024-11-04
2
meta-llama/Meta-Llama-3.1-8B
nhyha_N3N_gemma-2-9b-it_20241029_1532_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/nhyha/N3N_gemma-2-9b-it_20241029_1532" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nhyha/N3N_gemma-2-9b-it_20241029_1532</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nhyha__N3N_gemma-2-9b-it_20241029_1532-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nhyha/N3N_gemma-2-9b-it_20241029_1532
6cfc55a717961ef206978b577bd74df97efe1455
32.022249
gemma
2
10.159
true
false
false
false
2.394044
0.675194
67.519404
0.586312
40.986668
0.204683
20.468278
0.340604
12.080537
0.459354
16.385938
0.412234
34.692671
false
false
2024-10-29
2024-11-04
1
unsloth/gemma-2-9b-it
nhyha_N3N_gemma-2-9b-it_20241110_2026_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/nhyha/N3N_gemma-2-9b-it_20241110_2026" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nhyha/N3N_gemma-2-9b-it_20241110_2026</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nhyha__N3N_gemma-2-9b-it_20241110_2026-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nhyha/N3N_gemma-2-9b-it_20241110_2026
2d4c24278ed9d8b42a4035da16a5aea745797441
28.741941
gemma
0
10.159
true
false
false
true
2.54055
0.628283
62.828296
0.586715
40.944106
0.138218
13.821752
0.336409
11.521253
0.407302
9.779427
0.402011
33.556811
false
false
2024-11-12
2024-11-12
1
unsloth/gemma-2-9b-it
nhyha_merge_Qwen2.5-7B-Instruct_20241023_0314_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nhyha/merge_Qwen2.5-7B-Instruct_20241023_0314" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nhyha/merge_Qwen2.5-7B-Instruct_20241023_0314</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nhyha__merge_Qwen2.5-7B-Instruct_20241023_0314-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nhyha/merge_Qwen2.5-7B-Instruct_20241023_0314
4d93f65c1f870556f05c77a1ef4f26819d49daf7
29.208865
apache-2.0
0
7.616
true
false
false
false
0.695701
0.569457
56.945682
0.555853
36.365185
0.219789
21.978852
0.321309
9.50783
0.425062
11.099479
0.454205
39.356161
false
false
2024-10-23
2024-11-04
3
Qwen/Qwen2.5-7B
nidum_Nidum-Limitless-Gemma-2B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/nidum/Nidum-Limitless-Gemma-2B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nidum/Nidum-Limitless-Gemma-2B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nidum__Nidum-Limitless-Gemma-2B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nidum/Nidum-Limitless-Gemma-2B
e209e3513d2b34c0e6c433ede26e17604c25cb1a
5.939422
apache-2.0
4
2.506
true
false
false
true
0.396814
0.242351
24.235141
0.30788
3.45106
0
0
0.264262
1.901566
0.374031
4.120573
0.117354
1.928191
false
false
2024-08-02
2024-08-07
0
nidum/Nidum-Limitless-Gemma-2B
nisten_franqwenstein-35b_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nisten/franqwenstein-35b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nisten/franqwenstein-35b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nisten__franqwenstein-35b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nisten/franqwenstein-35b
7180aa73e82945a1d2ae0eb304508e21d57e4c27
35.941926
mit
8
34.714
true
false
false
false
5.01777
0.379863
37.986321
0.664658
52.227468
0.30287
30.287009
0.403523
20.469799
0.494021
22.119271
0.573055
52.561687
false
false
2024-10-03
2024-10-03
1
nisten/franqwenstein-35b (Merge)
nisten_franqwenstein-35b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nisten/franqwenstein-35b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nisten/franqwenstein-35b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nisten__franqwenstein-35b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nisten/franqwenstein-35b
901351a987d664a1cd7f483115a167d3ae5694ec
34.451117
mit
8
34.714
true
false
false
true
6.328604
0.391354
39.135383
0.659113
51.680277
0.304381
30.438066
0.35906
14.541387
0.468104
19.679688
0.561087
51.2319
false
false
2024-10-03
2024-10-03
1
nisten/franqwenstein-35b (Merge)
nisten_tqwendo-36b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nisten/tqwendo-36b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nisten/tqwendo-36b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nisten__tqwendo-36b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nisten/tqwendo-36b
c50f38e8421785af4b8596f81e0098a6585b4f05
36.676665
mit
8
35.69
true
false
false
false
9.150391
0.677767
67.776721
0.643183
49.414936
0.393505
39.350453
0.331376
10.850112
0.442958
15.103125
0.438082
37.564642
false
false
2024-12-21
2024-12-21
1
nisten/tqwendo-36b (Merge)
nlpguy_Lion-Lamarck-v.1.0.8_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nlpguy/Lion-Lamarck-v.1.0.8" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nlpguy/Lion-Lamarck-v.1.0.8</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nlpguy__Lion-Lamarck-v.1.0.8-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nlpguy/Lion-Lamarck-v.1.0.8
3f1c2632893f4a7d22ab50a1b87ebee9f054086f
33.491283
2
14.766
false
false
false
true
2.028809
0.450905
45.090471
0.586893
40.848441
0.410876
41.087613
0.358221
14.42953
0.467271
19.008854
0.464345
40.482787
false
false
2025-01-27
2025-01-27
1
nlpguy/Lion-Lamarck-v.1.0.8 (Merge)
nlpguy_Lion-Lamarck-v.1.0.9_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nlpguy/Lion-Lamarck-v.1.0.9" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nlpguy/Lion-Lamarck-v.1.0.9</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nlpguy__Lion-Lamarck-v.1.0.9-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nlpguy/Lion-Lamarck-v.1.0.9
2bb3b70d5eab9fbc39a10452c601fe36d00b9fca
33.860542
1
14.766
false
false
false
false
1.953311
0.340895
34.089549
0.591824
40.468848
0.413897
41.389728
0.390101
18.680089
0.529958
27.378125
0.470412
41.156915
false
false
2025-01-28
2025-01-28
1
nlpguy/Lion-Lamarck-v.1.0.9 (Merge)
nlpguy_Lion-Lamarck-v.1.1.0_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nlpguy/Lion-Lamarck-v.1.1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nlpguy/Lion-Lamarck-v.1.1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nlpguy__Lion-Lamarck-v.1.1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nlpguy/Lion-Lamarck-v.1.1.0
dedb68b932cb4bbf50d80150419fdca664ba63e5
34.249639
1
14.766
false
false
false
false
1.943976
0.365775
36.577503
0.596246
41.166304
0.40861
40.861027
0.392617
19.01566
0.532531
27.533073
0.463098
40.344267
false
false
2025-01-29
2025-01-30
1
nlpguy/Lion-Lamarck-v.1.1.0 (Merge)
nlpguy_Mistral-NeMo-Minitron-Upscale-v1_bfloat16
bfloat16
🟢 pretrained
🟢
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nlpguy/Mistral-NeMo-Minitron-Upscale-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nlpguy/Mistral-NeMo-Minitron-Upscale-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nlpguy__Mistral-NeMo-Minitron-Upscale-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nlpguy/Mistral-NeMo-Minitron-Upscale-v1
9e6d747cbb81e1f25915a0f42802cbeb85b61c3e
10.87693
other
0
12.451
true
false
false
false
2.934306
0.16484
16.48404
0.4468
22.06891
0.007553
0.755287
0.280201
4.026846
0.380354
4.844271
0.25374
17.082225
true
false
2024-09-29
2024-09-29
1
nlpguy/Mistral-NeMo-Minitron-Upscale-v1 (Merge)
nlpguy_Mistral-NeMo-Minitron-Upscale-v2_bfloat16
bfloat16
🟢 pretrained
🟢
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nlpguy/Mistral-NeMo-Minitron-Upscale-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nlpguy/Mistral-NeMo-Minitron-Upscale-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nlpguy__Mistral-NeMo-Minitron-Upscale-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nlpguy/Mistral-NeMo-Minitron-Upscale-v2
4ac077e496705687fdcbe51f3b915be42e91bf79
8.232151
other
0
12.451
true
false
false
false
2.924619
0.157272
15.727159
0.394967
14.382673
0.006042
0.60423
0.27349
3.131991
0.379083
5.252083
0.192653
10.29477
true
false
2024-09-29
2024-09-29
1
nlpguy/Mistral-NeMo-Minitron-Upscale-v2 (Merge)
nlpguy_Mistral-NeMo-Minitron-Upscale-v3_bfloat16
bfloat16
🟢 pretrained
🟢
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nlpguy/Mistral-NeMo-Minitron-Upscale-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nlpguy/Mistral-NeMo-Minitron-Upscale-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nlpguy__Mistral-NeMo-Minitron-Upscale-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nlpguy/Mistral-NeMo-Minitron-Upscale-v3
6703b09d3d78cc020448ee93c53dc727312bcbaf
5.013437
other
1
12.451
true
false
false
false
6.044669
0.14121
14.120977
0.305245
3.398266
0
0
0.259228
1.230425
0.409844
9.430469
0.117104
1.900488
true
false
2024-10-04
2024-10-04
1
nlpguy/Mistral-NeMo-Minitron-Upscale-v3 (Merge)
nlpguy_StableProse_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nlpguy/StableProse" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nlpguy/StableProse</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nlpguy__StableProse-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nlpguy/StableProse
4937dc747684705e4b87df27b47eab5429f3a9c1
16.422495
1
12.248
false
false
false
false
1.794363
0.197239
19.723888
0.511656
30.180203
0.05287
5.287009
0.302852
7.04698
0.406708
8.871875
0.346825
27.425015
false
false
2024-08-16
2024-08-17
1
nlpguy/StableProse (Merge)
nlpguy_StarFusion-alpha1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nlpguy/StarFusion-alpha1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nlpguy/StarFusion-alpha1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nlpguy__StarFusion-alpha1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nlpguy/StarFusion-alpha1
dccad965a710d7bee001b6387c8307e7c320291e
20.840912
apache-2.0
1
7.242
true
false
false
true
1.174406
0.566009
56.60093
0.442869
21.933182
0.072508
7.250755
0.295302
6.040268
0.408104
8.879688
0.319066
24.340647
true
false
2024-04-13
2024-06-26
1
nlpguy/StarFusion-alpha1 (Merge)
noname0202_Llama-3.2-4x3B-Instruct_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/noname0202/Llama-3.2-4x3B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">noname0202/Llama-3.2-4x3B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/noname0202__Llama-3.2-4x3B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
noname0202/Llama-3.2-4x3B-Instruct
b7db5c4ec1138be364127e0482adabc8355d0943
23.972363
0
9.949
false
false
false
true
1.192251
0.706718
70.671817
0.464731
24.689909
0.156344
15.634441
0.272651
3.020134
0.367396
4.424479
0.328541
25.393395
false
false
2025-01-26
0
Removed
noname0202_gemma-2-2b-it-ties_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/noname0202/gemma-2-2b-it-ties" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">noname0202/gemma-2-2b-it-ties</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/noname0202__gemma-2-2b-it-ties-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
noname0202/gemma-2-2b-it-ties
7ab51f4991186f6850d826e4ddc44a053de05f2f
9.68618
0
2.614
false
false
false
true
1.202698
0.126571
12.657083
0.420574
18.139569
0.001511
0.151057
0.270134
2.684564
0.392885
7.14401
0.256067
17.340795
false
false
2025-01-29
0
Removed
noname0202_gemma-2-9b-sft-jp-en-zh-v1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/noname0202/gemma-2-9b-sft-jp-en-zh-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">noname0202/gemma-2-9b-sft-jp-en-zh-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/noname0202__gemma-2-9b-sft-jp-en-zh-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
noname0202/gemma-2-9b-sft-jp-en-zh-v1
06ca7f00ce3ddece15cb50a1292ce0912e19af4e
15.807153
0
9.242
false
false
false
true
0.915736
0.298805
29.880495
0.451929
22.00024
0.026435
2.643505
0.307047
7.606264
0.40801
9.101302
0.3125
23.611111
false
false
2025-01-05
0
Removed
noname0202_gemma-2-9b-sft-jp-en-zh-v2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/noname0202/gemma-2-9b-sft-jp-en-zh-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">noname0202/gemma-2-9b-sft-jp-en-zh-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/noname0202__gemma-2-9b-sft-jp-en-zh-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
noname0202/gemma-2-9b-sft-jp-en-zh-v2
b32a4038b8617c7620ee7761609d926ddda8c1fe
17.821656
0
9.242
false
false
false
true
0.812772
0.399347
39.934707
0.451504
22.658059
0.02568
2.567976
0.287752
5.033557
0.361156
7.011198
0.36752
29.724439
false
false
2025-01-05
0
Removed
noname0202_llama-math-1b-r16-0to512tokens-test_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/noname0202/llama-math-1b-r16-0to512tokens-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">noname0202/llama-math-1b-r16-0to512tokens-test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/noname0202__llama-math-1b-r16-0to512tokens-test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
noname0202/llama-math-1b-r16-0to512tokens-test
df274b741781f3f3ecce2ef86883863aaeb71c58
13.55231
apache-2.0
0
1.236
true
false
false
true
0.366534
0.546975
54.697536
0.348842
8.389239
0.066465
6.646526
0.266779
2.237136
0.314313
1.255729
0.172789
8.087692
false
false
2025-01-24
2025-01-25
0
noname0202/llama-math-1b-r16-0to512tokens-test
noname0202_llama-math-1b-r32-0to512tokens-test_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/noname0202/llama-math-1b-r32-0to512tokens-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">noname0202/llama-math-1b-r32-0to512tokens-test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/noname0202__llama-math-1b-r32-0to512tokens-test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
noname0202/llama-math-1b-r32-0to512tokens-test
200ae9a8db697345c6471e12a225f2e7adb953c1
13.981026
apache-2.0
0
1.236
true
false
false
true
0.400464
0.568258
56.825778
0.349518
8.1919
0.067221
6.722054
0.265101
2.013423
0.320948
1.685156
0.176031
8.447843
false
false
2025-01-24
2025-01-24
0
noname0202/llama-math-1b-r32-0to512tokens-test
noname0202_llama-math-1b-r32-test_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/noname0202/llama-math-1b-r32-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">noname0202/llama-math-1b-r32-test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/noname0202__llama-math-1b-r32-test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
noname0202/llama-math-1b-r32-test
3b2cd2f41ed1a9894dd1bdd275f45081d5c6caf1
14.254482
apache-2.0
0
1.236
true
false
false
true
0.373398
0.581922
58.192152
0.348596
8.498755
0.062689
6.268882
0.261745
1.565996
0.315646
2.322396
0.178108
8.678709
false
false
2025-01-24
2025-01-24
0
noname0202/llama-math-1b-r32-test
noname0202_llama-math-1b-r8-512tokens-test_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/noname0202/llama-math-1b-r8-512tokens-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">noname0202/llama-math-1b-r8-512tokens-test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/noname0202__llama-math-1b-r8-512tokens-test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
noname0202/llama-math-1b-r8-512tokens-test
ba83c9aca75dd38a66df90eb2dd1cb56db6d3c9a
14.328637
apache-2.0
0
1.236
true
false
false
true
0.36586
0.579199
57.919875
0.349576
8.396798
0.063444
6.344411
0.268456
2.46085
0.316948
2.485156
0.175283
8.364731
false
false
2025-01-24
2025-01-24
0
noname0202/llama-math-1b-r8-512tokens-test
notbdq_Qwen2.5-14B-Instruct-1M-GRPO-Reasoning_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/notbdq/Qwen2.5-14B-Instruct-1M-GRPO-Reasoning" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">notbdq/Qwen2.5-14B-Instruct-1M-GRPO-Reasoning</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/notbdq__Qwen2.5-14B-Instruct-1M-GRPO-Reasoning-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
notbdq/Qwen2.5-14B-Instruct-1M-GRPO-Reasoning
8322cabcc11053dca1f6fac6f3ffac4781ec9641
32.835462
1
14.77
false
false
false
true
1.695876
0.841356
84.135649
0.619822
45.658281
0.006798
0.679758
0.343121
12.416107
0.418
11.35
0.484957
42.772976
false
false
2025-02-01
2025-02-05
0
notbdq/Qwen2.5-14B-Instruct-1M-GRPO-Reasoning
nothingiisreal_L3.1-8B-Celeste-V1.5_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/nothingiisreal/L3.1-8B-Celeste-V1.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nothingiisreal/L3.1-8B-Celeste-V1.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nothingiisreal__L3.1-8B-Celeste-V1.5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nothingiisreal/L3.1-8B-Celeste-V1.5
e7ea0e3d2727c8cf66c0481ffa251f28cb85429f
26.121793
llama3.1
38
8.03
true
false
false
true
0.707164
0.732672
73.267153
0.50118
28.887967
0.143505
14.350453
0.284396
4.58613
0.374865
5.591406
0.370429
30.047651
false
false
2024-07-27
2024-12-04
0
nothingiisreal/L3.1-8B-Celeste-V1.5
nothingiisreal_MN-12B-Starcannon-v2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nothingiisreal/MN-12B-Starcannon-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nothingiisreal/MN-12B-Starcannon-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nothingiisreal__MN-12B-Starcannon-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nothingiisreal/MN-12B-Starcannon-v2
f2ff756e8c32d9107d4f6a3c18c730e3fe0cae88
18.030393
apache-2.0
6
12.248
true
false
false
true
1.722663
0.392527
39.252738
0.50045
28.424783
0.050604
5.060423
0.278523
3.803132
0.397812
7.993229
0.312832
23.64805
true
false
2024-08-13
2024-09-03
1
nothingiisreal/MN-12B-Starcannon-v2 (Merge)
nothingiisreal_MN-12B-Starcannon-v3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nothingiisreal/MN-12B-Starcannon-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nothingiisreal/MN-12B-Starcannon-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nothingiisreal__MN-12B-Starcannon-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nothingiisreal/MN-12B-Starcannon-v3
169480b62121c4f070e93a05158545c679712644
18.993414
12
12.248
false
false
false
true
1.745671
0.380738
38.073755
0.517055
30.873002
0.068731
6.873112
0.27349
3.131991
0.404635
9.846094
0.326463
25.16253
false
false
2024-08-13
2024-09-03
1
nothingiisreal/MN-12B-Starcannon-v3 (Merge)
nvidia_AceInstruct-1.5B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nvidia/AceInstruct-1.5B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/AceInstruct-1.5B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__AceInstruct-1.5B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/AceInstruct-1.5B
1e3d02075fcf988407b436eb5c10a407be86c71f
12.904384
cc-by-nc-4.0
11
1.777
true
false
false
true
0.853204
0.394776
39.477586
0.393196
15.468561
0
0
0.271812
2.908277
0.346
2.083333
0.257397
17.488549
false
true
2025-01-15
2025-01-24
0
nvidia/AceInstruct-1.5B
nvidia_AceInstruct-72B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nvidia/AceInstruct-72B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/AceInstruct-72B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__AceInstruct-72B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/AceInstruct-72B
3c5c88ea8ab5d5067e23a482cc26014a0d23e848
30.66182
cc-by-nc-4.0
11
72.706
true
false
false
true
40.393347
0.711889
71.18889
0.613904
44.20382
0.041541
4.154079
0.321309
9.50783
0.420604
11.875521
0.487367
43.04078
false
true
2025-01-15
2025-01-24
0
nvidia/AceInstruct-72B
nvidia_AceInstruct-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nvidia/AceInstruct-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/AceInstruct-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__AceInstruct-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/AceInstruct-7B
3bbb14f63afd2dc890c7932bfffb4f6dc3bfa1e8
24.295214
cc-by-nc-4.0
5
7.616
true
false
false
true
0.925072
0.542229
54.222906
0.550118
36.574814
0.003776
0.377644
0.307047
7.606264
0.4255
11.6875
0.417719
35.302157
false
true
2025-01-15
2025-01-24
0
nvidia/AceInstruct-7B
nvidia_AceMath-1.5B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nvidia/AceMath-1.5B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/AceMath-1.5B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__AceMath-1.5B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/AceMath-1.5B-Instruct
166818c371eaafb212b243aecadd50b1079fa776
11.378178
cc-by-nc-4.0
4
1.777
true
false
false
true
0.831622
0.321237
32.123654
0.40243
16.76251
0
0
0.274329
3.243848
0.360698
4.320573
0.206366
11.818484
false
true
2025-01-13
2025-01-24
0
nvidia/AceMath-1.5B-Instruct
nvidia_AceMath-72B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nvidia/AceMath-72B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/AceMath-72B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__AceMath-72B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/AceMath-72B-Instruct
9bab369176cddd6cbc38b2002ffbef9a3152aade
24.759834
cc-by-nc-4.0
6
72.706
true
false
false
true
44.005618
0.494993
49.499328
0.640216
48.687772
0.000755
0.075529
0.270973
2.796421
0.406156
9.602865
0.441074
37.897089
false
true
2025-01-14
2025-01-24
0
nvidia/AceMath-72B-Instruct
nvidia_AceMath-72B-RM_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForSequenceClassification
<a target="_blank" href="https://huggingface.co/nvidia/AceMath-72B-RM" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/AceMath-72B-RM</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__AceMath-72B-RM-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/AceMath-72B-RM
bb8cb2e7bd45c1d74894d87a95249d5dd5c19bf4
3.428827
cc-by-nc-4.0
6
71.461
true
false
false
true
75.000362
0.14126
14.125964
0.271743
1.403502
0
0
0.23406
0
0.335146
3.059896
0.117852
1.983599
false
true
2025-01-14
2025-01-24
0
nvidia/AceMath-72B-RM
nvidia_AceMath-7B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nvidia/AceMath-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/AceMath-7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__AceMath-7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/AceMath-7B-Instruct
f29b4bd5ad5e4fc7bfb52343dca2dd07e948f964
19.76606
cc-by-nc-4.0
9
7.616
true
false
false
true
0.988654
0.453178
45.317757
0.499385
29.993826
0
0
0.291946
5.592841
0.419271
11.208854
0.338348
26.483082
false
true
2025-01-13
2025-01-24
0
nvidia/AceMath-7B-Instruct
nvidia_AceMath-7B-RM_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForSequenceClassification
<a target="_blank" href="https://huggingface.co/nvidia/AceMath-7B-RM" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/AceMath-7B-RM</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__AceMath-7B-RM-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/AceMath-7B-RM
2a7b81019f94d1a78eec298f7cf5c677ff958f5a
3.22439
cc-by-nc-4.0
5
7.071
true
false
false
true
0.67714
0.149378
14.937809
0.242269
0.251527
0
0
0.245805
0
0.358
2.616667
0.113863
1.540337
false
true
2025-01-14
2025-01-24
0
nvidia/AceMath-7B-RM
nvidia_Hymba-1.5B-Base_bfloat16
bfloat16
🟢 pretrained
🟢
Original
HymbaForCausalLM
<a target="_blank" href="https://huggingface.co/nvidia/Hymba-1.5B-Base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/Hymba-1.5B-Base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__Hymba-1.5B-Base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/Hymba-1.5B-Base
85e5b833d75f26170c7684ba83140f1bf9fedf37
7.921989
other
138
1.523
true
false
false
false
9.107914
0.229512
22.951214
0.325648
7.689941
0.006798
0.679758
0.255872
0.782998
0.356635
5.179427
0.192237
10.248596
false
true
2024-10-09
2024-12-06
0
nvidia/Hymba-1.5B-Base
nvidia_Hymba-1.5B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
HymbaForCausalLM
<a target="_blank" href="https://huggingface.co/nvidia/Hymba-1.5B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/Hymba-1.5B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__Hymba-1.5B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/Hymba-1.5B-Instruct
ffc758eefef247c0ee4d7ce41636562759027ce6
13.739211
other
222
1.523
true
false
false
true
6.712666
0.600906
60.09056
0.306713
4.591464
0
0
0.288591
5.145414
0.331583
1.047917
0.204039
11.559914
false
true
2024-10-31
2024-12-06
1
nvidia/Hymba-1.5B-Instruct (Merge)
nvidia_Llama-3.1-Minitron-4B-Depth-Base_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/nvidia/Llama-3.1-Minitron-4B-Depth-Base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/Llama-3.1-Minitron-4B-Depth-Base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__Llama-3.1-Minitron-4B-Depth-Base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/Llama-3.1-Minitron-4B-Depth-Base
40d82bc951b4f39e9c9e11176334250c30975098
11.53217
other
20
4.02
true
false
false
false
0.467691
0.160694
16.069363
0.41707
19.44411
0.012085
1.208459
0.263423
1.789709
0.401063
10.699479
0.279837
19.9819
false
true
2024-08-13
2024-09-25
0
nvidia/Llama-3.1-Minitron-4B-Depth-Base
nvidia_Llama-3.1-Nemotron-70B-Instruct-HF_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/nvidia/Llama-3.1-Nemotron-70B-Instruct-HF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/Llama-3.1-Nemotron-70B-Instruct-HF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__Llama-3.1-Nemotron-70B-Instruct-HF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/Llama-3.1-Nemotron-70B-Instruct-HF
250db5cf2323e04a6d2025a2ca2b94a95c439e88
34.578372
llama3.1
2,014
70.554
true
false
false
true
13.628748
0.738067
73.806722
0.6316
47.10953
0.287009
28.700906
0.258389
1.118568
0.43276
13.195052
0.491855
43.53945
false
true
2024-10-12
2024-10-16
2
meta-llama/Meta-Llama-3.1-70B
nvidia_Minitron-4B-Base_bfloat16
bfloat16
🟢 pretrained
🟢
Original
NemotronForCausalLM
<a target="_blank" href="https://huggingface.co/nvidia/Minitron-4B-Base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/Minitron-4B-Base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__Minitron-4B-Base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/Minitron-4B-Base
d6321f64412982046a32d761701167e752fedc02
11.939973
other
129
4
true
false
false
false
1.189267
0.221794
22.179373
0.408388
17.215601
0.017372
1.73716
0.269295
2.572707
0.413375
9.938542
0.261968
17.996454
false
true
2024-07-19
2024-09-25
0
nvidia/Minitron-4B-Base
nvidia_Minitron-8B-Base_bfloat16
bfloat16
🟢 pretrained
🟢
Original
NemotronForCausalLM
<a target="_blank" href="https://huggingface.co/nvidia/Minitron-8B-Base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/Minitron-8B-Base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__Minitron-8B-Base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/Minitron-8B-Base
70fa5997afc42807f41eebd5d481f040556fdf97
14.178726
other
63
7.22
true
false
false
false
1.412521
0.242427
24.242676
0.439506
22.040793
0.023414
2.34139
0.27349
3.131991
0.402552
9.085677
0.318068
24.229832
false
true
2024-07-19
2024-09-25
0
nvidia/Minitron-8B-Base
nvidia_Mistral-NeMo-Minitron-8B-Base_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nvidia/Mistral-NeMo-Minitron-8B-Base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/Mistral-NeMo-Minitron-8B-Base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__Mistral-NeMo-Minitron-8B-Base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/Mistral-NeMo-Minitron-8B-Base
cc94637b669b62c4829b1e0c3b9074fecd883b74
17.660162
other
169
7.88
true
false
false
false
3.404028
0.194566
19.456597
0.52191
31.822015
0.046073
4.607251
0.325503
10.067114
0.409156
8.944531
0.379571
31.06346
false
true
2024-08-19
2024-08-22
0
nvidia/Mistral-NeMo-Minitron-8B-Base