Column schema (one flattened record per model):

| Column | Dtype | Range / classes |
|---|---|---|
| eval_name | stringlengths | 12–111 |
| Precision | stringclasses | 3 values |
| Type | stringclasses | 7 values |
| T | stringclasses | 7 values |
| Weight type | stringclasses | 3 values |
| Architecture | stringclasses | 62 values |
| Model | stringlengths | 355–689 |
| fullname | stringlengths | 4–102 |
| Model sha | stringlengths | 0–40 |
| Average ⬆️ | float64 | 0.74–52 |
| Hub License | stringclasses | 27 values |
| Hub ❤️ | int64 | 0–5.99k |
| #Params (B) | float64 | -1–141 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.03–107 |
| IFEval Raw | float64 | 0–0.9 |
| IFEval | float64 | 0–90 |
| BBH Raw | float64 | 0.24–0.75 |
| BBH | float64 | 0.25–64.1 |
| MATH Lvl 5 Raw | float64 | 0–0.52 |
| MATH Lvl 5 | float64 | 0–52.4 |
| GPQA Raw | float64 | 0.21–0.47 |
| GPQA | float64 | 0–29.4 |
| MUSR Raw | float64 | 0.29–0.6 |
| MUSR | float64 | 0–38.5 |
| MMLU-PRO Raw | float64 | 0.1–0.73 |
| MMLU-PRO | float64 | 0–70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | stringclasses | 480 values |
| Submission Date | stringclasses | 220 values |
| Generation | int64 | 0–10 |
| Base Model | stringlengths | 4–102 |
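A minimal sketch of pulling the same table programmatically. It assumes this excerpt is the Open LLM Leaderboard contents dataset on the Hugging Face Hub; the `open-llm-leaderboard/contents` dataset id and the `train` split are assumptions, while the 📑 link on each record points at a matching `open-llm-leaderboard/<org>__<model>-details` dataset with per-sample results.

```python
# Sketch only: assumes the table is published as the Hub dataset
# "open-llm-leaderboard/contents" with a "train" split.
from datasets import load_dataset

contents = load_dataset("open-llm-leaderboard/contents", split="train")

# Look up one of the rows shown below by its `fullname` column.
row = next(r for r in contents if r["fullname"] == "zelk12/MT-Gen1-gemma-2-9B")
print(row["Average ⬆️"], row["MMLU-PRO"])

# In this excerpt, `eval_name` is just `fullname` with "/" -> "_"
# plus a precision suffix:
assert row["eval_name"] == row["fullname"].replace("/", "_") + "_" + row["Precision"]
```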
One record per evaluated model follows. Every row in this excerpt has Weight type `Original`, so that column is not repeated; the `T` emoji stands for the Type label (💬 = chat models (RLHF, DPO, IFT, ...); 🤝 = base merges and moerges; 🔶 = fine-tuned on domain-specific datasets); `eval_name` and the per-row model/details links are folded into the Model and Precision columns. Benchmark cells read `normalized (raw)`.

| Model | Precision | Type | Architecture | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | On hub | MoE | Flagged | Chat Template | CO₂ (kg) | IFEval (raw) | BBH (raw) | MATH Lvl 5 (raw) | GPQA (raw) | MUSR (raw) | MMLU-PRO (raw) | Merged | Official | Uploaded | Submitted | Gen | Base Model |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| yifAI/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_8b-table-0.002 | bfloat16 | 💬 | LlamaForCausalLM | 7a046b74179225d6055dd8aa601b5234f817b1e5 | 22.624782 |  | 0 | 8.03 | false | false | false | true | 0.672016 | 64.896586 (0.648966) | 27.281064 (0.491452) | 6.873112 (0.068731) | 1.565996 (0.261745) | 7.134375 (0.389875) | 27.997562 (0.351978) | false | false |  | 2024-09-30 | 0 | Removed |
| ylalain/ECE-PRYMMAL-YL-1B-SLERP-V8 | bfloat16 | 🤝 | Qwen2ForCausalLM | 2c00dbc74e55d42fbc8b08f474fb9568f820edb9 | 9.604139 | apache-2.0 | 0 | 1.357 | true | false | false | false | 0.548428 | 15.052727 (0.150527) | 15.175392 (0.397557) | 0 (0) | 5.257271 (0.28943) | 6.765625 (0.387458) | 15.373818 (0.238364) | false | false | 2024-11-13 | 2024-11-13 | 0 | ylalain/ECE-PRYMMAL-YL-1B-SLERP-V8 |
| ymcki/gemma-2-2b-ORPO-jpn-it-abliterated-18 | bfloat16 | 🔶 | Gemma2ForCausalLM | aed2a9061ffa21beaec0d617a9605e160136aab4 | 14.633781 | gemma | 0 | 2.614 | true | false | false | true | 6.200402 | 46.309459 (0.463095) | 16.301992 (0.40529) | 0.377644 (0.003776) | 5.145414 (0.288591) | 4.728385 (0.375427) | 14.93979 (0.234458) | false | false | 2024-10-30 | 2024-11-16 | 3 | google/gemma-2-2b |
| ymcki/gemma-2-2b-ORPO-jpn-it-abliterated-18-merge | bfloat16 | 🔶 | Gemma2ForCausalLM | b72be0a7879f0d82cb2024cfc1d02c370ce3efe8 | 15.737663 | gemma | 0 | 2.614 | true | false | false | true | 1.98799 | 52.182099 (0.521821) | 17.348337 (0.414689) | 0.830816 (0.008308) | 4.474273 (0.283557) | 3.357813 (0.351396) | 16.232639 (0.246094) | false | false | 2024-10-30 | 2024-11-16 | 3 | google/gemma-2-2b |
| ymcki/gemma-2-2b-jpn-it-abliterated-17 | bfloat16 | 🔶 | Gemma2ForCausalLM | e6f82b93dae0b8207aa3252ab4157182e2610787 | 15.002982 | gemma | 1 | 2.614 | true | false | false | true | 1.104509 | 50.815724 (0.508157) | 16.234749 (0.407627) | 0 (0) | 2.908277 (0.271812) | 3.891146 (0.370062) | 16.167996 (0.245512) | false | false | 2024-10-16 | 2024-10-18 | 3 | google/gemma-2-2b |
| ymcki/gemma-2-2b-jpn-it-abliterated-17-18-24 | bfloat16 | 🔶 | Gemma2ForCausalLM | 38f56fcb99bd64278a1d90dd23aea527036329a0 | 14.019765 | gemma | 0 | 2.614 | true | false | false | true | 0.704859 | 50.548434 (0.505484) | 13.114728 (0.381236) | 0 (0) | 4.138702 (0.28104) | 2.069531 (0.350156) | 14.247193 (0.228225) | false | false | 2024-11-06 | 2024-11-06 | 3 | google/gemma-2-2b |
| ymcki/gemma-2-2b-jpn-it-abliterated-17-ORPO | bfloat16 | 🔶 | Gemma2ForCausalLM | 531b2e2043285cb40cd0433f5ad43441f8ac6b6c | 14.516851 | gemma | 1 | 2.614 | true | false | false | true | 9.681597 | 47.478468 (0.474785) | 14.389413 (0.389798) | 4.229607 (0.042296) | 3.243848 (0.274329) | 4.528385 (0.37676) | 13.231383 (0.219082) | false | false | 2024-10-18 | 2024-10-27 | 3 | google/gemma-2-2b |
| ymcki/gemma-2-2b-jpn-it-abliterated-17-ORPO-alpaca | bfloat16 | 🔶 | Gemma2ForCausalLM | 5503b5e892be463fa4b1d265b8ba9ba4304af012 | 12.001731 |  | 0 | 2.614 | false | false | false | true | 1.184666 | 30.647349 (0.306473) | 16.922412 (0.40716) | 0.075529 (0.000755) | 2.572707 (0.269295) | 7.914583 (0.396917) | 13.877807 (0.2249) | false | false |  | 2024-10-27 | 0 | Removed |
| ymcki/gemma-2-2b-jpn-it-abliterated-18 | bfloat16 | 🔶 | Gemma2ForCausalLM | c50b85f9b60b444f85fe230b8d77fcbc7b18ef91 | 15.503245 | gemma | 1 | 2.614 | true | false | false | true | 1.052664 | 51.752461 (0.517525) | 17.143415 (0.413219) | 0 (0) | 3.131991 (0.27349) | 4.269531 (0.374156) | 16.722074 (0.250499) | false | false | 2024-10-15 | 2024-10-18 | 3 | google/gemma-2-2b |
| ymcki/gemma-2-2b-jpn-it-abliterated-18-ORPO | bfloat16 | 🔶 | Gemma2ForCausalLM | b9f41f53827b8a5a600546b41f63023bf84617a3 | 14.943472 | gemma | 0 | 2.614 | true | false | false | true | 1.610377 | 47.423503 (0.474235) | 16.538079 (0.403894) | 3.549849 (0.035498) | 1.565996 (0.261745) | 7.416667 (0.395333) | 13.166741 (0.218501) | false | false | 2024-10-22 | 2024-10-22 | 3 | google/gemma-2-2b |
| ymcki/gemma-2-2b-jpn-it-abliterated-24 | bfloat16 | 🔶 | Gemma2ForCausalLM | 06c129ba5261ee88e32035c88f90ca11d835175d | 15.604076 | gemma | 0 | 2.614 | true | false | false | true | 0.810442 | 49.786566 (0.497866) | 16.77259 (0.41096) | 0 (0) | 3.691275 (0.277685) | 7.002865 (0.39149) | 16.371158 (0.24734) | false | false | 2024-10-24 | 2024-10-25 | 3 | google/gemma-2-2b |
| yuchenxie/ArlowGPT-3B-Multilingual | float16 | 🔶 | LlamaForCausalLM | 336f9084b4718be34ec7348e8082670539aebb4c | 20.186472 | mit | 1 | 3.213 | true | false | false | true | 0.611822 | 63.954862 (0.639549) | 19.50317 (0.43014) | 9.365559 (0.093656) | 4.026846 (0.280201) | 4.083333 (0.372667) | 20.185062 (0.281666) | false | false | 2024-11-03 | 2025-01-12 | 1 | yuchenxie/ArlowGPT-3B-Multilingual (Merge) |
| yuchenxie/ArlowGPT-8B | float16 | 🔶 | LlamaForCausalLM | f7d0149059f1324a7725676b6ab67df59cd4c599 | 28.797338 | mit | 3 | 8.03 | true | false | false | true | 0.71678 | 78.465361 (0.784654) | 29.842909 (0.508016) | 19.335347 (0.193353) | 5.816555 (0.293624) | 8.361979 (0.388229) | 30.961879 (0.378657) | false | false | 2024-10-05 | 2025-01-12 | 1 | yuchenxie/ArlowGPT-8B (Merge) |
| yuvraj17/Llama3-8B-SuperNova-Spectrum-Hermes-DPO | bfloat16 | 💬 | LlamaForCausalLM | 0da9f780f7dd94ed1e10c8d3e082472ff2922177 | 18.075579 | apache-2.0 | 0 | 8.03 | true | false | false | true | 0.97203 | 46.908979 (0.46909) | 21.238563 (0.439987) | 5.589124 (0.055891) | 6.935123 (0.302013) | 9.61901 (0.401219) | 18.162677 (0.263464) | false | false | 2024-09-24 | 2024-09-30 | 0 | yuvraj17/Llama3-8B-SuperNova-Spectrum-Hermes-DPO |
| yuvraj17/Llama3-8B-SuperNova-Spectrum-dare_ties | bfloat16 | 🤝 | LlamaForCausalLM | 998d15b32900bc230727c8a7984e005f611723e9 | 19.134801 | apache-2.0 | 0 | 8.03 | true | false | false | false | 0.914144 | 40.127085 (0.401271) | 23.492188 (0.461579) | 8.232628 (0.082326) | 3.355705 (0.275168) | 11.003385 (0.421094) | 28.597813 (0.35738) | true | false | 2024-09-22 | 2024-09-23 | 1 | yuvraj17/Llama3-8B-SuperNova-Spectrum-dare_ties (Merge) |
| yuvraj17/Llama3-8B-abliterated-Spectrum-slerp | bfloat16 | 🤝 | LlamaForCausalLM | 28789950975ecf5aac846c3f2c0a5d6841651ee6 | 17.687552 | apache-2.0 | 0 | 8.03 | true | false | false | false | 0.82666 | 28.848788 (0.288488) | 28.54693 (0.497791) | 5.81571 (0.058157) | 6.823266 (0.301174) | 11.011198 (0.399823) | 25.079418 (0.325715) | true | false | 2024-09-22 | 2024-09-23 | 1 | yuvraj17/Llama3-8B-abliterated-Spectrum-slerp (Merge) |
| zake7749/gemma-2-2b-it-chinese-kyara-dpo | bfloat16 | 🔶 | Gemma2ForCausalLM | bbc011dae0416c1664a0287f3a7a0f9563deac91 | 19.334585 | gemma | 8 | 2.614 | true | false | false | false | 1.279309 | 53.820751 (0.538208) | 19.061804 (0.425746) | 6.646526 (0.066465) | 2.237136 (0.266779) | 16.761979 (0.457563) | 17.479314 (0.257314) | false | false | 2024-08-18 | 2024-10-17 | 1 | zake7749/gemma-2-2b-it-chinese-kyara-dpo (Merge) |
| zelk12/Gemma-2-TM-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 42366d605e6bdad354a5632547e37d34d300ff7a | 30.151929 |  | 0 | 10.159 | false | false | false | true | 1.967893 | 80.446216 (0.804462) | 42.049491 (0.598659) | 0 (0) | 12.863535 (0.346477) | 11.238281 (0.41524) | 34.314051 (0.408826) | false | false | 2024-11-06 | 2024-11-06 | 1 | zelk12/Gemma-2-TM-9B (Merge) |
| zelk12/MT-Gen1-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | b78f8883614cbbdf182ebb4acf8a8c124bc782ae | 33.041356 |  | 0 | 10.159 | false | false | false | true | 3.362746 | 78.862529 (0.788625) | 44.011247 (0.61) | 13.36858 (0.133686) | 12.863535 (0.346477) | 11.577604 (0.421688) | 37.564642 (0.438082) | false | false | 2024-10-23 | 2024-10-23 | 1 | zelk12/MT-Gen1-gemma-2-9B (Merge) |
| zelk12/MT-Gen2-GI-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | e970fbcbf974f4626dcc6db7d2b02d4f24c72744 | 33.315847 |  | 1 | 10.159 | false | false | false | true | 1.868506 | 79.139794 (0.791398) | 44.002591 (0.609556) | 13.36858 (0.133686) | 13.422819 (0.350671) | 12.673698 (0.428323) | 37.287603 (0.435588) | false | false | 2024-11-10 | 2024-11-28 | 1 | zelk12/MT-Gen2-GI-gemma-2-9B (Merge) |
| zelk12/MT-Gen2-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | c723f8b9b7334fddd1eb8b6e5230b76fb18139a5 | 33.644495 |  | 1 | 10.159 | false | false | false | true | 1.989448 | 79.074855 (0.790749) | 44.107782 (0.610049) | 14.879154 (0.148792) | 12.863535 (0.346477) | 13.303125 (0.432292) | 37.63852 (0.438747) | false | false | 2024-11-10 | 2024-11-10 | 1 | zelk12/MT-Gen2-gemma-2-9B (Merge) |
| zelk12/MT-Gen3-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 84627594655776ce67f1e01233113b658333fa71 | 32.936869 |  | 2 | 10.159 | false | false | false | true | 1.813248 | 80.201421 (0.802014) | 43.950648 (0.609711) | 11.404834 (0.114048) | 13.199105 (0.348993) | 11.577604 (0.421688) | 37.287603 (0.435588) | false | false | 2024-11-28 | 2024-11-30 | 1 | zelk12/MT-Gen3-gemma-2-9B (Merge) |
| zelk12/MT-Gen4-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | d44beca936d18a5b4b65799487504c1097ae1cb2 | 33.722578 | gemma | 1 | 10.159 | true | false | false | true | 1.781683 | 78.83006 (0.788301) | 43.960404 (0.610988) | 16.540785 (0.165408) | 13.982103 (0.354866) | 11.383594 (0.422802) | 37.63852 (0.438747) | true | false | 2024-12-13 | 2024-12-13 | 1 | zelk12/MT-Gen4-gemma-2-9B (Merge) |
| zelk12/MT-Gen5-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | aef27049b2a3c52138016e9602280150f70eae32 | 33.78338 | gemma | 1 | 10.159 | true | false | false | true | 1.787428 | 79.232215 (0.792322) | 44.398244 (0.613279) | 16.8429 (0.168429) | 13.534676 (0.35151) | 10.8875 (0.420167) | 37.804743 (0.440243) | true | false | 2024-12-22 | 2024-12-22 | 1 | zelk12/MT-Gen5-gemma-2-9B (Merge) |
| zelk12/MT-Gen6-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | bd348fb1c1524e0d7d625200a292e46387b04da2 | 18.444364 | gemma | 1 | 10.159 | true | false | false | true | 2.008031 | 16.156686 (0.161567) | 39.396915 (0.584467) | 0 (0) | 11.073826 (0.333054) | 8.865885 (0.406927) | 35.172872 (0.416556) | true | false | 2025-01-23 | 2025-01-23 | 1 | zelk12/MT-Gen6-gemma-2-9B (Merge) |
| zelk12/MT-Max-Merge_02012025163610-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 2f279c5c648c22e77327d0c0098f90b69312afd3 | 34.049126 | gemma | 1 | 10.159 | true | false | false | true | 1.846119 | 79.074855 (0.790749) | 44.501684 (0.614224) | 18.202417 (0.182024) | 13.534676 (0.35151) | 11.25026 (0.422802) | 37.730866 (0.439578) | true | false | 2025-01-02 | 2025-01-02 | 1 | zelk12/MT-Max-Merge_02012025163610-gemma-2-9B (Merge) |
| zelk12/MT-Merge-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | f4c3b001bc8692bcbbd7005b6f8db048e651aa46 | 33.393208 |  | 3 | 10.159 | false | false | false | true | 3.219056 | 80.353795 (0.803538) | 44.320842 (0.611838) | 13.141994 (0.13142) | 13.087248 (0.348154) | 12.103125 (0.425625) | 37.352246 (0.43617) | false | false | 2024-10-22 | 2024-10-22 | 1 | zelk12/MT-Merge-gemma-2-9B (Merge) |
| zelk12/MT-Merge1-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 71bb4577c877715f3f6646a224b184544639c856 | 33.130536 |  | 1 | 10.159 | false | false | false | true | 4.036662 | 78.862529 (0.788625) | 44.058246 (0.61) | 12.688822 (0.126888) | 13.534676 (0.35151) | 12.148177 (0.424385) | 37.490765 (0.437417) | false | false | 2024-11-07 | 2024-11-07 | 1 | zelk12/MT-Merge1-gemma-2-9B (Merge) |
| zelk12/MT-Merge2-MU-gemma-2-MTg2MT1g2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 6d73ec2204800f7978c376567d3c6361c0a072cd | 33.557528 |  | 2 | 10.159 | false | false | false | true | 1.844885 | 79.559458 (0.795595) | 43.8402 (0.608389) | 13.821752 (0.138218) | 13.422819 (0.350671) | 13.228646 (0.432229) | 37.472296 (0.437251) | false | false | 2024-11-25 | 2024-11-28 | 1 | zelk12/MT-Merge2-MU-gemma-2-MTg2MT1g2-9B (Merge) |
| zelk12/MT-Merge2-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | a695e722e6fab77852f9fe59bbc4d69fe23c4208 | 33.498975 |  | 2 | 10.159 | false | false | false | true | 1.850791 | 78.770108 (0.787701) | 44.157197 (0.610668) | 15.558912 (0.155589) | 13.422819 (0.350671) | 11.510938 (0.421688) | 37.573877 (0.438165) | false | false | 2024-11-25 | 2024-11-25 | 1 | zelk12/MT-Merge2-gemma-2-9B (Merge) |
| zelk12/MT-Merge3-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 3f02f5e76d3aade3340307eb34b15bc9dd5a2023 | 33.192106 | gemma | 0 | 10.159 | true | false | false | true | 1.881021 | 78.585265 (0.785853) | 44.066073 (0.610211) | 13.36858 (0.133686) | 13.199105 (0.348993) | 12.452083 (0.42575) | 37.481531 (0.437334) | true | false | 2024-12-11 | 2024-12-11 | 1 | zelk12/MT-Merge3-gemma-2-9B (Merge) |
| zelk12/MT-Merge4-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 5f076ad8a3f3c403840a1cd572a6018bea34e889 | 34.12096 | gemma | 1 | 10.159 | true | false | false | true | 1.873385 | 78.073179 (0.780732) | 44.053492 (0.611822) | 18.806647 (0.188066) | 13.646532 (0.352349) | 12.479688 (0.429437) | 37.666223 (0.438996) | true | false | 2024-12-21 | 2024-12-21 | 1 | zelk12/MT-Merge4-gemma-2-9B (Merge) |
| zelk12/MT-Merge5-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | d8adfc6c5395baaeb3f5e0b50c585ed3f662c4d9 | 33.647426 | gemma | 2 | 10.159 | true | false | false | true | 1.80079 | 78.437878 (0.784379) | 44.240598 (0.612267) | 15.558912 (0.155589) | 13.758389 (0.353188) | 12.25026 (0.428135) | 37.63852 (0.438747) | true | false | 2024-12-30 | 2024-12-30 | 1 | zelk12/MT-Merge5-gemma-2-9B (Merge) |
| zelk12/MT-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 24e1f894517b86dd866c1a5999ced4a5924dcd90 | 30.239612 |  | 2 | 10.159 | false | false | false | true | 3.023399 | 79.684349 (0.796843) | 43.324243 (0.60636) | 0.302115 (0.003021) | 12.751678 (0.345638) | 9.55599 (0.407115) | 35.819297 (0.422374) | false | false | 2024-10-11 | 2024-10-11 | 1 | zelk12/MT-gemma-2-9B (Merge) |
| zelk12/MT1-Gen1-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 939ac6c12059a18fc1117cdb3861f46816eff2fb | 33.232259 |  | 0 | 10.159 | false | false | false | true | 3.362485 | 79.744301 (0.797443) | 44.273282 (0.611779) | 12.23565 (0.122356) | 12.527964 (0.34396) | 13.103125 (0.430958) | 37.509235 (0.437583) | false | false | 2024-10-23 | 2024-10-24 | 1 | zelk12/MT1-Gen1-gemma-2-9B (Merge) |
| zelk12/MT1-Gen2-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | aeaca7dc7d50a425a5d3c38d7c4a7daf1c772ad4 | 33.142398 |  | 2 | 10.159 | false | false | false | true | 1.995995 | 79.836722 (0.798367) | 43.919191 (0.609599) | 11.329305 (0.113293) | 13.646532 (0.352349) | 12.844271 (0.428354) | 37.278369 (0.435505) | false | false | 2024-11-11 | 2024-11-11 | 1 | zelk12/MT1-Gen2-gemma-2-9B (Merge) |
| zelk12/MT1-Gen3-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 5cc4ee1c70f08a5b1a195d43f044d9bf6fca29f5 | 32.964927 |  | 0 | 10.159 | false | false | false | true | 1.944877 | 79.596914 (0.795969) | 43.990306 (0.610155) | 11.782477 (0.117825) | 13.199105 (0.348993) | 12.007031 (0.424323) | 37.213726 (0.434924) | false | false | 2024-12-01 | 2024-12-01 | 1 | zelk12/MT1-Gen3-gemma-2-9B (Merge) |
| zelk12/MT1-Gen4-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 5eaf1ef67f32805c6fbc0b51418a8caf866661a2 | 31.507236 | gemma | 1 | 10.159 | true | false | false | true | 1.742062 | 79.412071 (0.794121) | 43.145368 (0.605757) | 4.909366 (0.049094) | 12.975391 (0.347315) | 12.089323 (0.423115) | 36.511894 (0.428607) | true | false | 2024-12-14 | 2024-12-14 | 1 | zelk12/MT1-Gen4-gemma-2-9B (Merge) |
| zelk12/MT1-Gen5-IF-gemma-2-S2DMv1-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 53a780fd3a2d42709a0f517cac019234d7d71267 | 30.810248 |  | 1 | 10.159 | false | false | false | true | 1.707127 | 79.292167 (0.792922) | 42.201028 (0.6) | 2.492447 (0.024924) | 12.527964 (0.34396) | 12.593229 (0.424479) | 35.754654 (0.421792) | false | false | 2024-12-24 | 2024-12-31 | 1 | zelk12/MT1-Gen5-IF-gemma-2-S2DMv1-9B (Merge) |
| zelk12/MT1-Gen5-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 4eb54f9a0a9f482537b0e79000ffe7fb9d024c38 | 30.636174 | gemma | 1 | 10.159 | true | false | false | true | 1.75523 | 77.948288 (0.779483) | 42.496764 (0.601746) | 3.247734 (0.032477) | 12.863535 (0.346477) | 11.459896 (0.419146) | 35.800827 (0.422207) | true | false | 2024-12-24 | 2024-12-24 | 1 | zelk12/MT1-Gen5-gemma-2-9B (Merge) |
| zelk12/MT1-Max-Merge_02012025163610-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | e9177c45a9dc1ff2ace378d4809ea92ff6e477c4 | 33.853363 | gemma | 1 | 10.159 | true | false | false | true | 1.87798 | 79.28718 (0.792872) | 44.226377 (0.612267) | 16.163142 (0.161631) | 13.982103 (0.354866) | 11.8875 (0.4255) | 37.573877 (0.438165) | true | false | 2025-01-04 | 2025-01-04 | 1 | zelk12/MT1-Max-Merge_02012025163610-gemma-2-9B (Merge) |
| zelk12/MT1-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 3a5e77518ca9c3c8ea2edac4c03bc220ee91f3ed | 33.633829 |  | 2 | 10.159 | false | false | false | true | 3.345719 | 79.467036 (0.79467) | 44.161526 (0.610875) | 14.954683 (0.149547) | 12.751678 (0.345638) | 13.161979 (0.432229) | 37.306073 (0.435755) | false | false | 2024-10-12 | 2024-10-14 | 1 | zelk12/MT1-gemma-2-9B (Merge) |
| zelk12/MT2-Gen1-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 167abf8eb4ea01fecd42dc32ad68160c51a8685a | 32.460223 |  | 0 | 10.159 | false | false | false | true | 3.38321 | 78.557782 (0.785578) | 44.141103 (0.61008) | 10.120846 (0.101208) | 12.416107 (0.343121) | 12.007031 (0.424323) | 37.518469 (0.437666) | false | false | 2024-10-24 | 2024-10-27 | 1 | zelk12/MT2-Gen1-gemma-2-9B (Merge) |
| zelk12/MT2-Gen2-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 24c487499b5833424ffb9932eed838bb254f61b4 | 33.471172 |  | 3 | 10.159 | false | false | false | true | 2.037441 | 78.890012 (0.7889) | 44.044503 (0.609292) | 14.803625 (0.148036) | 12.863535 (0.346477) | 12.577604 (0.427021) | 37.647754 (0.43883) | false | false | 2024-11-12 | 2024-11-12 | 1 | zelk12/MT2-Gen2-gemma-2-9B (Merge) |
| zelk12/MT2-Gen3-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | bb750c2b76328c6dbc9adf9ae3d09551f3723758 | 32.967895 |  | 1 | 10.159 | false | false | false | true | 1.924377 | 78.100662 (0.781007) | 44.007274 (0.610477) | 13.293051 (0.132931) | 12.863535 (0.346477) | 12.052083 (0.423083) | 37.490765 (0.437417) | false | false | 2024-12-04 | 2024-12-04 | 1 | zelk12/MT2-Gen3-gemma-2-9B (Merge) |
| zelk12/MT2-Gen4-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 7a07de3719c3b8b8e90e79a65798bcc4ef454fc6 | 31.860932 | gemma | 1 | 10.159 | true | false | false | true | 1.786436 | 78.959937 (0.789599) | 43.778362 (0.609655) | 8.308157 (0.083082) | 12.751678 (0.345638) | 10.467708 (0.412542) | 36.899749 (0.432098) | true | false | 2024-12-15 | 2024-12-15 | 1 | zelk12/MT2-Gen4-gemma-2-9B (Merge) |
| zelk12/MT2-Gen5-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 94711cc263eab1464fa6b01c28ee5171b4467d84 | 31.594551 | gemma | 0 | 10.159 | true | false | false | true | 1.762136 | 77.491168 (0.774912) | 43.124281 (0.606393) | 6.344411 (0.063444) | 13.534676 (0.35151) | 12.385417 (0.424417) | 36.687352 (0.430186) | true | false | 2024-12-25 | 2024-12-25 | 1 | zelk12/MT2-Gen5-gemma-2-9B (Merge) |
| zelk12/MT2-Max-Merge_02012025163610-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 76d8a9cc371af30b5843fb69edc25ff767d6741f | 33.768996 | gemma | 0 | 10.159 | true | false | false | true | 1.828975 | 79.014903 (0.790149) | 44.040817 (0.610846) | 16.993958 (0.16994) | 13.534676 (0.35151) | 11.354167 (0.422833) | 37.675458 (0.439079) | true | false | 2025-01-07 | 2025-01-07 | 1 | zelk12/MT2-Max-Merge_02012025163610-gemma-2-9B (Merge) |
| zelk12/MT2-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | d20d7169ce0f53d586504c50b4b7dc470bf8a781 | 33.2825 |  | 1 | 10.159 | false | false | false | true | 3.19411 | 78.857542 (0.788575) | 44.167481 (0.611511) | 14.728097 (0.147281) | 12.975391 (0.347315) | 11.540365 (0.421656) | 37.426123 (0.436835) | false | false | 2024-10-14 | 2024-10-15 | 1 | zelk12/MT2-gemma-2-9B (Merge) |
| zelk12/MT3-Gen1-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | cd78df9e67e2e710d8d305f5a03a92c01b1b425d | 31.054845 |  | 1 | 10.159 | false | false | false | true | 3.113666 | 78.377926 (0.783779) | 44.119495 (0.610676) | 3.247734 (0.032477) | 12.863535 (0.346477) | 10.75599 (0.415115) | 36.964391 (0.43268) | false | false | 2024-10-24 | 2024-10-28 | 1 | zelk12/MT3-Gen1-gemma-2-9B (Merge) |
| zelk12/MT3-Gen2-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | e4ef057d20751d89934025e9088ba98d89b921b5 | 30.963626 |  | 1 | 10.159 | false | false | false | true | 1.919108 | 78.432891 (0.784329) | 43.940226 (0.609147) | 2.039275 (0.020393) | 14.317673 (0.357383) | 10.022656 (0.411115) | 37.029034 (0.433261) | false | false | 2024-11-20 | 2024-11-20 | 1 | zelk12/MT3-Gen2-gemma-2-9B (Merge) |
| zelk12/MT3-Gen3-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 4ad54d6295f6364aa87f7aaa2a7bd112fb92ec00 | 32.359994 |  | 1 | 10.159 | false | false | false | true | 1.904463 | 78.562769 (0.785628) | 43.78374 (0.608889) | 9.063444 (0.090634) | 13.534676 (0.35151) | 12.51875 (0.42575) | 36.696587 (0.430269) | false | false | 2024-12-07 | 2024-12-07 | 1 | zelk12/MT3-Gen3-gemma-2-9B (Merge) |
| zelk12/MT3-Gen4-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 22066f5a275797ae870d2c58e8c75ac933ee1adf | 34.492356 | gemma | 4 | 10.159 | true | false | false | true | 1.820853 | 77.371264 (0.773713) | 43.779591 (0.610084) | 20.468278 (0.204683) | 12.975391 (0.347315) | 14.721094 (0.447635) | 37.63852 (0.438747) | true | false | 2024-12-16 | 2024-12-16 | 1 | zelk12/MT3-Gen4-gemma-2-9B (Merge) |
| zelk12/MT3-Gen5-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | b02315782a4719734b159220ca1eef0770d022a5 | 32.189706 | gemma | 1 | 10.159 | true | false | false | true | 1.940627 | 79.901661 (0.799017) | 43.951199 (0.609862) | 7.250755 (0.072508) | 13.758389 (0.353188) | 11.422656 (0.419115) | 36.853576 (0.431682) | true | false | 2024-12-26 | 2024-12-26 | 1 | zelk12/MT3-Gen5-gemma-2-9B (Merge) |
| zelk12/MT3-Gen5-gemma-2-9B_v1 | bfloat16 | 🤝 | Gemma2ForCausalLM | 40bfcc25ff421ff83d67a9c46474a0b40abf4f4a | 32.846339 | gemma | 2 | 10.159 | true | false | false | true | 1.928152 | 79.961613 (0.799616) | 44.159602 (0.611333) | 10.951662 (0.109517) | 13.199105 (0.348993) | 11.48151 (0.420385) | 37.324542 (0.435921) | true | false | 2024-12-27 | 2024-12-27 | 1 | zelk12/MT3-Gen5-gemma-2-9B_v1 (Merge) |
| zelk12/MT3-Max-Merge_02012025163610-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 6499758258ac6278e7fdc4ba6719ab28f35709e8 | 22.064889 | gemma | 0 | 10.159 | true | false | false | true | 1.946282 | 17.615482 (0.176155) | 44.20652 (0.612346) | 7.703927 (0.077039) | 13.422819 (0.350671) | 11.783594 (0.425469) | 37.656989 (0.438913) | true | false | 2025-01-09 | 2025-01-09 | 1 | zelk12/MT3-Max-Merge_02012025163610-gemma-2-9B (Merge) |
| zelk12/MT3-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | d501b6ea59896fac3dc0a623501a5493b3573cde | 32.352524 |  | 1 | 10.159 | false | false | false | true | 3.136653 | 77.860854 (0.778609) | 44.248465 (0.613078) | 10.498489 (0.104985) | 12.639821 (0.344799) | 11.903125 (0.424292) | 36.964391 (0.43268) | false | false | 2024-10-15 | 2024-10-16 | 1 | zelk12/MT3-gemma-2-9B (Merge) |
| zelk12/MT4-Gen1-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 6ed2c66246c7f354decfd3579acb534dc4b0b48c | 33.544994 |  | 0 | 10.159 | false | false | false | true | 2.103561 | 78.949964 (0.7895) | 44.009524 (0.609383) | 15.030211 (0.150302) | 12.527964 (0.34396) | 13.095313 (0.432229) | 37.656989 (0.438913) | false | false | 2024-10-25 | 2024-10-29 | 1 | zelk12/MT4-Gen1-gemma-2-9B (Merge) |
| zelk12/MT4-Gen2-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 4d61a5799b11641a24e8b0f3eda0e987ff392089 | 33.794732 |  | 4 | 10.159 | false | false | false | true | 1.977047 | 80.506168 (0.805062) | 44.176658 (0.610835) | 15.70997 (0.1571) | 12.751678 (0.345638) | 12.207031 (0.425656) | 37.416888 (0.436752) | false | false | 2024-11-22 | 2024-11-22 | 1 | zelk12/MT4-Gen2-gemma-2-9B (Merge) |
| zelk12/MT4-Gen3-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | f93026d28ca1707e8c21620be8558eed6be43b1c | 33.239752 |  | 0 | 10.159 | false | false | false | true | 1.958701 | 78.405409 (0.784054) | 43.89439 (0.608711) | 15.10574 (0.151057) | 12.527964 (0.34396) | 11.940365 (0.424323) | 37.564642 (0.438082) | false | false | 2024-12-08 | 2024-12-08 | 1 | zelk12/MT4-Gen3-gemma-2-9B (Merge) |
| zelk12/MT4-Gen4-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 51f3deb0aba90d82fc3f21894b3171fa5afbffa5 | 32.090103 | gemma | 0 | 10.159 | true | false | false | true | 1.793741 | 78.742625 (0.787426) | 43.47581 (0.607603) | 7.703927 (0.077039) | 13.646532 (0.352349) | 12.044271 (0.424354) | 36.927453 (0.432347) | true | false | 2024-12-19 | 2024-12-19 | 1 | zelk12/MT4-Gen4-gemma-2-9B (Merge) |
| zelk12/MT4-Gen5-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 59681ccdc7e6f1991cc5663464806665bc3bf4c8 | 33.423935 | gemma | 2 | 10.159 | true | false | false | true | 1.847362 | 77.888336 (0.778883) | 43.947892 (0.610666) | 14.879154 (0.148792) | 14.205817 (0.356544) | 12.020833 (0.426833) | 37.601581 (0.438414) | true | false | 2024-12-28 | 2024-12-28 | 1 | zelk12/MT4-Gen5-gemma-2-9B (Merge) |
| zelk12/MT4-Max-Merge_02012025163610-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 25e64938f38ed3db0113007a2814b069fd2952b0 | 22.004747 | gemma | 0 | 10.159 | true | false | false | true | 1.9584 | 17.707904 (0.177079) | 44.173982 (0.612013) | 7.55287 (0.075529) | 13.534676 (0.35151) | 11.383594 (0.422802) | 37.675458 (0.439079) | true | false | 2025-01-11 | 2025-01-11 | 1 | zelk12/MT4-Max-Merge_02012025163610-gemma-2-9B (Merge) |
| zelk12/MT4-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 2167ea02baf9145a697a7d828a17c75b86e5e282 | 33.447349 |  | 0 | 10.159 | false | false | false | true | 3.155259 | 77.616059 (0.776161) | 43.553827 (0.607314) | 17.371601 (0.173716) | 11.744966 (0.338087) | 12.999219 (0.430927) | 37.398419 (0.436586) | false | false | 2024-10-16 | 2024-10-20 | 1 | zelk12/MT4-gemma-2-9B (Merge) |
| zelk12/MT5-Gen1-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 0291b776e80f38381788cd8f1fb2c3435ad891b5 | 31.897632 |  | 0 | 10.159 | false | false | false | true | 2.017253 | 78.312987 (0.78313) | 44.183335 (0.611048) | 6.873112 (0.068731) | 12.975391 (0.347315) | 11.614844 (0.420385) | 37.426123 (0.436835) | false | false | 2024-10-25 | 2024-10-31 | 1 | zelk12/MT5-Gen1-gemma-2-9B (Merge) |
| zelk12/MT5-Gen2-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 3ee2822fcba6708bd9032b79249a2789e5996b6a | 32.600392 |  | 1 | 10.159 | false | false | false | true | 1.858381 | 79.624397 (0.796244) | 44.113215 (0.610541) | 10.347432 (0.103474) | 13.534676 (0.35151) | 10.436458 (0.416292) | 37.546173 (0.437916) | false | false | 2024-11-23 | 2024-11-23 | 1 | zelk12/MT5-Gen2-gemma-2-9B (Merge) |
| zelk12/MT5-Gen3-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 4b3811c689fec5c9cc483bb1ed696734e5e88fcf | 32.801838 |  | 0 | 10.159 | false | false | false | true | 1.937333 | 78.253035 (0.78253) | 43.885913 (0.609049) | 11.555891 (0.115559) | 13.534676 (0.35151) | 12.08151 (0.423052) | 37.5 (0.4375) | false | false | 2024-12-08 | 2024-12-08 | 1 | zelk12/MT5-Gen3-gemma-2-9B (Merge) |
| zelk12/MT5-Gen4-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 2f826d76460a5b7f150622a57f2d5419adfc464f | 33.765135 | gemma | 0 | 10.159 | true | false | false | true | 1.82172 | 78.345457 (0.783455) | 44.323211 (0.613106) | 17.069486 (0.170695) | 13.758389 (0.353188) | 11.354167 (0.422833) | 37.7401 (0.439661) | true | false | 2024-12-20 | 2024-12-20 | 1 | zelk12/MT5-Gen4-gemma-2-9B (Merge) |
| zelk12/MT5-Gen5-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | d1f68652d7dda810da8207a371d26376c6a6e847 | 32.091454 | gemma | 1 | 10.159 | true | false | false | true | 1.891645 | 79.472023 (0.79472) | 44.115081 (0.611166) | 7.326284 (0.073263) | 13.087248 (0.348154) | 11.55599 (0.419115) | 36.992095 (0.432929) | true | false | 2024-12-29 | 2024-12-29 | 1 | zelk12/MT5-Gen5-gemma-2-9B (Merge) |
| zelk12/MT5-Max-Merge_02012025163610-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | a90f9ca13af28c72695fabc56da4ddd8e3a8e173 | 22.001289 | gemma | 0 | 10.159 | true | false | false | true | 2.062209 | 17.615482 (0.176155) | 44.274407 (0.612679) |  |  |  |  |  |  |  |  |  |  |
0.077039
7.703927
0.35151
13.534676
0.422771
11.213021
0.438996
37.666223
true
false
2025-01-14
2025-01-14
1
zelk12/MT5-Max-Merge_02012025163610-gemma-2-9B (Merge)
zelk12_MT5-gemma-2-9B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/zelk12/MT5-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT5-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT5-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
zelk12/MT5-gemma-2-9B
b627ae7d796b1ae85b59c55e0e043b8d3ae73d83
32.595305
0
10.159
false
false
false
true
3.26983
0.804787
80.478685
0.611223
44.271257
0.095166
9.516616
0.343121
12.416107
0.420385
11.48151
0.436669
37.407654
false
false
2024-10-19
2024-10-21
1
zelk12/MT5-gemma-2-9B (Merge)
zelk12_MTM-Merge-gemma-2-9B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/zelk12/MTM-Merge-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MTM-Merge-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MTM-Merge-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
zelk12/MTM-Merge-gemma-2-9B
843f23c68cf50f5bdc0206f93e72ce0f9feeca6e
33.758993
gemma
2
10.159
true
false
false
true
1.793346
0.779808
77.980758
0.613335
44.380677
0.166163
16.616314
0.354866
13.982103
0.426771
11.946354
0.43883
37.647754
true
false
2025-01-01
2025-01-01
1
zelk12/MTM-Merge-gemma-2-9B (Merge)
zelk12_MTMaMe-Merge_02012025163610-gemma-2-9B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/zelk12/MTMaMe-Merge_02012025163610-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MTMaMe-Merge_02012025163610-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MTMaMe-Merge_02012025163610-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
zelk12/MTMaMe-Merge_02012025163610-gemma-2-9B
ce68b2468bcba0c5dcde79bbf5346db81f069b12
22.020442
gemma
0
10.159
true
false
false
true
1.89784
0.178603
17.860277
0.611679
44.160463
0.074018
7.401813
0.352349
13.646532
0.424104
11.479687
0.438165
37.573877
true
false
2025-01-16
2025-01-16
1
zelk12/MTMaMe-Merge_02012025163610-gemma-2-9B (Merge)
zelk12_Rv0.4DMv1t0.25-gemma-2-9B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/zelk12/Rv0.4DMv1t0.25-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/Rv0.4DMv1t0.25-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__Rv0.4DMv1t0.25-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
zelk12/Rv0.4DMv1t0.25-gemma-2-9B
23e7337dabbf023177c25ded4923286a2e3936fc
33.585317
0
10.159
false
false
false
true
1.918645
0.749658
74.965758
0.606971
43.664764
0.194109
19.410876
0.345638
12.751678
0.430927
12.932552
0.440076
37.786274
false
false
2024-12-31
2024-12-31
1
zelk12/Rv0.4DMv1t0.25-gemma-2-9B (Merge)
zelk12_Rv0.4DMv1t0.25Tt0.25-gemma-2-9B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/zelk12/Rv0.4DMv1t0.25Tt0.25-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/Rv0.4DMv1t0.25Tt0.25-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__Rv0.4DMv1t0.25Tt0.25-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
zelk12/Rv0.4DMv1t0.25Tt0.25-gemma-2-9B
28fbcc2fa23f46aaaed327984784251527c78815
32.304
gemma
0
10.159
true
false
false
true
1.912583
0.76462
76.46201
0.609786
43.914819
0.112538
11.253776
0.342282
12.304251
0.428292
12.703125
0.434674
37.186022
true
false
2024-12-31
2024-12-31
1
zelk12/Rv0.4DMv1t0.25Tt0.25-gemma-2-9B (Merge)
zelk12_Rv0.4MT4g2-gemma-2-9B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/zelk12/Rv0.4MT4g2-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/Rv0.4MT4g2-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__Rv0.4MT4g2-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
zelk12/Rv0.4MT4g2-gemma-2-9B
ef595241d2c62203c27d84e6643d384a7cf99bd4
33.004199
gemma
1
10.159
true
false
false
true
1.853253
0.732022
73.202215
0.60412
43.199046
0.179758
17.975831
0.353188
13.758389
0.423083
11.91875
0.441739
37.970966
true
false
2025-01-04
2025-01-04
1
zelk12/Rv0.4MT4g2-gemma-2-9B (Merge)
zelk12_T31122024203920-gemma-2-9B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/zelk12/T31122024203920-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/T31122024203920-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__T31122024203920-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
zelk12/T31122024203920-gemma-2-9B
25cb58c73a3adf43cee33b50238b1d332b5ccc13
33.101317
gemma
0
10.159
true
false
false
true
1.866369
0.767618
76.76177
0.609563
43.728997
0.138973
13.897281
0.350671
13.422819
0.432198
13.32474
0.437251
37.472296
true
false
2024-12-31
2024-12-31
1
zelk12/T31122024203920-gemma-2-9B (Merge)
zelk12_Test01012025155054_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/zelk12/Test01012025155054" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/Test01012025155054</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__Test01012025155054-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
zelk12/Test01012025155054
c607186b0b079975e3305e0223e0a55f0cbc19e5
3.591417
0
3.817
false
false
false
true
0.700474
0.155523
15.55229
0.28295
1.280547
0
0
0.241611
0
0.367021
3.710937
0.109043
1.004728
false
false
2025-01-01
2025-01-01
1
zelk12/Test01012025155054 (Merge)
zelk12_Test01012025155054t0.5_gemma-2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/zelk12/Test01012025155054t0.5_gemma-2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/Test01012025155054t0.5_gemma-2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__Test01012025155054t0.5_gemma-2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
zelk12/Test01012025155054t0.5_gemma-2
14fcae0d420d303df84bd9b9c8744a6f0fa147fb
3.591417
0
3.817
false
false
false
true
0.697964
0.155523
15.55229
0.28295
1.280547
0
0
0.241611
0
0.367021
3.710937
0.109043
1.004728
false
false
2025-01-01
2025-01-01
1
zelk12/Test01012025155054t0.5_gemma-2 (Merge)
zelk12_gemma-2-S2MTM-9B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/zelk12/gemma-2-S2MTM-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/gemma-2-S2MTM-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__gemma-2-S2MTM-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
zelk12/gemma-2-S2MTM-9B
fd6860743943114eeca6fc2e800e27c87873bcc5
31.148621
gemma
0
10.159
true
false
false
true
1.765103
0.782256
78.225553
0.606084
43.115728
0.04003
4.003021
0.345638
12.751678
0.421844
12.163802
0.429688
36.631944
true
false
2024-12-11
2024-12-11
1
zelk12/gemma-2-S2MTM-9B (Merge)
zelk12_recoilme-gemma-2-Ataraxy-9B-v0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ataraxy-9B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1
b4208ddf6c741884c16c77b9433d9ead8f216354
30.344893
2
10.159
false
false
false
true
3.443191
0.764895
76.489492
0.607451
43.706516
0.013595
1.359517
0.349832
13.310962
0.413625
10.303125
0.432098
36.899749
false
false
2024-10-03
2024-10-03
1
zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1 (Merge)
zelk12_recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25
e652c9e07265526851dad994f4640aa265b9ab56
33.300246
1
10.159
false
false
false
true
3.194991
0.770665
77.066517
0.607543
43.85035
0.155589
15.558912
0.343121
12.416107
0.43226
13.132552
0.439993
37.777039
false
false
2024-10-04
2024-10-04
1
zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25 (Merge)
zelk12_recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75
eb0e589291630ba20328db650f74af949d217a97
28.421762
0
10.159
false
false
false
true
3.751453
0.720806
72.080635
0.59952
42.487153
0
0
0.349832
13.310962
0.395115
7.75599
0.414063
34.895833
false
false
2024-10-04
2024-10-04
1
zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75 (Merge)
zelk12_recoilme-gemma-2-Ataraxy-9B-v0.2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ataraxy-9B-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2
76f56b25bf6d8704282f8c77bfda28ca384883bc
30.113979
1
10.159
false
false
false
true
3.413675
0.759999
75.999902
0.606626
43.633588
0.012085
1.208459
0.348154
13.087248
0.410958
9.836458
0.432264
36.918218
false
false
2024-10-07
2024-10-11
1
zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2 (Merge)
zelk12_recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1
1e3e623e9f0b386bfd967c629dd39c87daef5bed
31.626376
1
10.159
false
false
false
true
6.461752
0.761523
76.152276
0.609878
43.941258
0.073263
7.326284
0.341443
12.192394
0.431021
13.310937
0.431516
36.835106
false
false
2024-10-07
2024-10-07
1
zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1 (Merge)
zelk12_recoilme-gemma-2-Ifable-9B-v0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/zelk12/recoilme-gemma-2-Ifable-9B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/recoilme-gemma-2-Ifable-9B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ifable-9B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
zelk12/recoilme-gemma-2-Ifable-9B-v0.1
8af6620b39c9a36239879b6b2bd88f66e9e9d930
32.254423
0
10.159
false
false
false
true
6.542869
0.794396
79.439554
0.60644
43.39057
0.09139
9.138973
0.35151
13.534676
0.420229
11.095313
0.432347
36.927453
false
false
2024-10-07
2024-10-07
1
zelk12/recoilme-gemma-2-Ifable-9B-v0.1 (Merge)
zelk12_recoilme-gemma-2-psy10k-mental_healt-9B-v0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-psy10k-mental_healt-9B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1
ced039b03be6f65ac0f713efcee76c6534e65639
32.448061
1
10.159
false
false
false
true
3.13222
0.744537
74.453672
0.597759
42.132683
0.180514
18.05136
0.34396
12.527964
0.429469
12.183594
0.418052
35.339096
false
false
2024-10-07
2024-10-07
1
zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1 (Merge)
zetasepic_Qwen2.5-32B-Instruct-abliterated-v2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/zetasepic/Qwen2.5-32B-Instruct-abliterated-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zetasepic/Qwen2.5-32B-Instruct-abliterated-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zetasepic__Qwen2.5-32B-Instruct-abliterated-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
zetasepic/Qwen2.5-32B-Instruct-abliterated-v2
5894fbf0a900e682dfc0ed794db337093bd8d26b
36.969237
apache-2.0
7
32.764
true
false
false
true
6.744789
0.833413
83.341312
0.693402
56.533818
0
0
0.36745
15.659955
0.435427
14.928385
0.562168
51.35195
false
false
2024-10-11
2024-12-07
2
Qwen/Qwen2.5-32B
zetasepic_Qwen2.5-72B-Instruct-abliterated_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/zetasepic/Qwen2.5-72B-Instruct-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zetasepic/Qwen2.5-72B-Instruct-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zetasepic__Qwen2.5-72B-Instruct-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
zetasepic/Qwen2.5-72B-Instruct-abliterated
af94b3c05c9857dbac73afb1cbce00e4833ec9ef
45.293139
other
24
72.706
true
false
false
false
18.809182
0.715261
71.526106
0.715226
59.912976
0.46148
46.148036
0.406879
20.917226
0.471917
19.122917
0.587184
54.131575
false
false
2024-10-01
2024-11-08
2
Qwen/Qwen2.5-72B
zhengr_MixTAO-7Bx2-MoE-v8.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/zhengr/MixTAO-7Bx2-MoE-v8.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zhengr/MixTAO-7Bx2-MoE-v8.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zhengr__MixTAO-7Bx2-MoE-v8.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
zhengr/MixTAO-7Bx2-MoE-v8.1
828e963abf2db0f5af9ed0d4034e538fc1cf5f40
17.168311
apache-2.0
55
12.879
true
true
false
true
0.92739
0.418781
41.878106
0.420194
19.176907
0.066465
6.646526
0.298658
6.487696
0.397625
8.303125
0.284658
20.517509
false
false
2024-02-26
2024-06-27
0
zhengr/MixTAO-7Bx2-MoE-v8.1