| Column | Type | Details |
|---|---|---|
| eval_name | string | length 12–111 |
| Precision | string | 3 values |
| Type | string | 6 values |
| T | string | 6 values |
| Weight type | string | 2 values |
| Architecture | string | 52 values |
| Model | string | length 355–689 |
| fullname | string | length 4–102 |
| Model sha | string | length 0–40 |
| Average ⬆️ | float64 | min 1.03, max 52 |
| Hub License | string | 26 values |
| Hub ❤️ | int64 | min 0, max 5.9k |
| #Params (B) | int64 | min -1, max 140 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | min 0.03, max 107 |
| IFEval Raw | float64 | min 0, max 0.9 |
| IFEval | float64 | min 0, max 90 |
| BBH Raw | float64 | min 0.27, max 0.75 |
| BBH | float64 | min 0.81, max 63.5 |
| MATH Lvl 5 Raw | float64 | min 0, max 0.51 |
| MATH Lvl 5 | float64 | min 0, max 50.7 |
| GPQA Raw | float64 | min 0.22, max 0.44 |
| GPQA | float64 | min 0, max 24.9 |
| MUSR Raw | float64 | min 0.29, max 0.6 |
| MUSR | float64 | min 0, max 38.5 |
| MMLU-PRO Raw | float64 | min 0.1, max 0.73 |
| MMLU-PRO | float64 | min 0, max 70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | string | 424 values |
| Submission Date | string | 169 values |
| Generation | int64 | min 0, max 10 |
| Base Model | string | length 4–102 |
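The rows below follow this schema, one field per line. As a minimal sketch of how such rows might be consumed programmatically — assuming the Hugging Face `datasets` library, and assuming this preview corresponds to the public `open-llm-leaderboard/contents` dataset (an assumption based on the details links in the rows; substitute the actual repo id if different):

```python
# Minimal sketch: load leaderboard rows and rank a subset by Average score.
# The repo id "open-llm-leaderboard/contents" is an assumption; the column
# names ("fullname", "Flagged", "Average ⬆️", ...) come from the schema above.
from datasets import load_dataset

ds = load_dataset("open-llm-leaderboard/contents", split="train")

# Keep non-flagged google/* submissions and sort by the aggregate score.
rows = [r for r in ds if r["fullname"].startswith("google/") and not r["Flagged"]]
rows.sort(key=lambda r: r["Average ⬆️"], reverse=True)

for r in rows[:5]:
    print(f'{r["eval_name"]:55s} {r["Precision"]:9s} {r["Average ⬆️"]:6.2f}')
```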
gmonsoon_StockSeaLLMs-7B-v1_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
[gmonsoon/StockSeaLLMs-7B-v1](https://huggingface.co/gmonsoon/StockSeaLLMs-7B-v1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/gmonsoon__StockSeaLLMs-7B-v1-details)
gmonsoon/StockSeaLLMs-7B-v1
2431fe5e4a3f63984c2936cf1cf68b3c7172cc20
24.753571
0
7
false
false
false
true
0.69689
0.459922
45.99219
0.527109
34.012625
0.175982
17.598187
0.302852
7.04698
0.421375
11.071875
0.395196
32.799572
false
false
2024-11-20
2024-11-20
1
gmonsoon/StockSeaLLMs-7B-v1 (Merge)
gmonsoon_gemma2-9b-sahabatai-v1-instruct-BaseTIES_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
[gmonsoon/gemma2-9b-sahabatai-v1-instruct-BaseTIES](https://huggingface.co/gmonsoon/gemma2-9b-sahabatai-v1-instruct-BaseTIES) [📑](https://huggingface.co/datasets/open-llm-leaderboard/gmonsoon__gemma2-9b-sahabatai-v1-instruct-BaseTIES-details)
gmonsoon/gemma2-9b-sahabatai-v1-instruct-BaseTIES
43296081051afe5d7a426b86a6d73104efab440b
33.703864
gemma
1
9
true
false
false
true
1.781485
0.737792
73.779239
0.607724
43.401342
0.193353
19.335347
0.32047
9.395973
0.477802
19.12526
0.434674
37.186022
true
false
2024-11-16
2024-11-17
1
gmonsoon/gemma2-9b-sahabatai-v1-instruct-BaseTIES (Merge)
google_codegemma-1.1-2b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GemmaForCausalLM
[google/codegemma-1.1-2b](https://huggingface.co/google/codegemma-1.1-2b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__codegemma-1.1-2b-details)
google/codegemma-1.1-2b
9d69e500da236427eab5867552ffc87108964f4d
7.033163
gemma
17
2
true
false
false
false
0.949883
0.229363
22.936254
0.335342
7.551225
0.006798
0.679758
0.265101
2.013423
0.387146
5.926563
0.127826
3.091755
false
true
2024-04-30
2024-08-12
0
google/codegemma-1.1-2b
google_flan-t5-base_float16
float16
🟢 pretrained
🟢
Original
T5ForConditionalGeneration
[google/flan-t5-base](https://huggingface.co/google/flan-t5-base) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__flan-t5-base-details)
google/flan-t5-base
7bcac572ce56db69c1ea7c8af255c5d7c9672fc2
6.239408
apache-2.0
816
0
true
false
false
false
0.156621
0.189071
18.907056
0.352598
11.337694
0
0
0.238255
0
0.367115
3.222656
0.135721
3.969046
false
true
2022-10-21
2024-08-14
0
google/flan-t5-base
google_flan-t5-large_float16
float16
🟢 pretrained
🟢
Original
T5ForConditionalGeneration
[google/flan-t5-large](https://huggingface.co/google/flan-t5-large) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__flan-t5-large-details)
google/flan-t5-large
0613663d0d48ea86ba8cb3d7a44f0f65dc596a2a
9.418949
apache-2.0
636
0
true
false
false
false
0.233491
0.220095
22.00949
0.415312
17.510018
0
0
0.250839
0.111857
0.408323
9.007031
0.170878
7.875296
false
true
2022-10-21
2024-08-14
0
google/flan-t5-large
google_flan-t5-small_float16
float16
🟢 pretrained
🟢
Original
T5ForConditionalGeneration
[google/flan-t5-small](https://huggingface.co/google/flan-t5-small) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__flan-t5-small-details)
google/flan-t5-small
0fc9ddf78a1e988dac52e2dac162b0ede4fd74ab
6.003781
apache-2.0
284
0
true
false
false
false
0.14313
0.152426
15.242556
0.32829
6.363112
0
0
0.260906
1.454139
0.412292
10.369792
0.123338
2.593085
false
true
2022-10-21
2024-06-27
0
google/flan-t5-small
google_flan-t5-xl_float16
float16
🟢 pretrained
🟢
Original
T5ForConditionalGeneration
[google/flan-t5-xl](https://huggingface.co/google/flan-t5-xl) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__flan-t5-xl-details)
google/flan-t5-xl
7d6315df2c2fb742f0f5b556879d730926ca9001
11.59178
apache-2.0
473
2
true
false
false
false
0.348929
0.223742
22.374189
0.453106
22.695056
0.000755
0.075529
0.252517
0.33557
0.418094
11.328385
0.214678
12.741947
false
true
2022-10-21
2024-08-07
0
google/flan-t5-xl
google_flan-t5-xl_bfloat16
bfloat16
🟢 pretrained
🟢
Original
T5ForConditionalGeneration
[google/flan-t5-xl](https://huggingface.co/google/flan-t5-xl) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__flan-t5-xl-details)
google/flan-t5-xl
7d6315df2c2fb742f0f5b556879d730926ca9001
11.587167
apache-2.0
473
2
true
false
false
false
0.285352
0.220694
22.069442
0.453722
22.837588
0.000755
0.075529
0.245805
0
0.422031
11.853906
0.214179
12.68654
false
true
2022-10-21
2024-08-07
0
google/flan-t5-xl
google_flan-t5-xxl_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
T5ForConditionalGeneration
[google/flan-t5-xxl](https://huggingface.co/google/flan-t5-xxl) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__flan-t5-xxl-details)
google/flan-t5-xxl
ae7c9136adc7555eeccc78cdd960dfd60fb346ce
13.485843
apache-2.0
1,211
11
true
false
false
false
0.706477
0.220045
22.004504
0.506589
30.119256
0
0
0.270134
2.684564
0.42175
11.185417
0.234292
14.921321
false
true
2022-10-21
2024-09-06
0
google/flan-t5-xxl
google_flan-ul2_bfloat16
bfloat16
🟢 pretrained
🟢
Original
T5ForConditionalGeneration
[google/flan-ul2](https://huggingface.co/google/flan-ul2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__flan-ul2-details)
google/flan-ul2
452d74ce28ac4a7f211d6ba3ef0717027f7a8074
13.550118
apache-2.0
554
19
true
false
false
false
0.559966
0.239254
23.925407
0.505374
30.02029
0.001511
0.151057
0.287752
5.033557
0.384354
5.577604
0.249335
16.59279
false
true
2023-03-03
2024-08-07
0
google/flan-ul2
google_gemma-1.1-2b-it_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
[google/gemma-1.1-2b-it](https://huggingface.co/google/gemma-1.1-2b-it) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__gemma-1.1-2b-it-details)
google/gemma-1.1-2b-it
bf4924f313df5166dee1467161e886e55f2eb4d4
7.776435
gemma
152
2
true
false
false
true
0.329215
0.306748
30.674832
0.318463
5.862827
0.001511
0.151057
0.269295
2.572707
0.339396
2.024479
0.148354
5.37271
false
true
2024-03-26
2024-06-12
0
google/gemma-1.1-2b-it
google_gemma-1.1-7b-it_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
[google/gemma-1.1-7b-it](https://huggingface.co/google/gemma-1.1-7b-it) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__gemma-1.1-7b-it-details)
google/gemma-1.1-7b-it
16128b0aeb50762ea96430c0c06a37941bf9f274
17.479586
gemma
266
8
true
false
false
true
0.578299
0.503911
50.391073
0.39353
15.934209
0.036254
3.625378
0.293624
5.816555
0.423021
11.510938
0.258394
17.599365
false
true
2024-03-26
2024-06-12
0
google/gemma-1.1-7b-it
google_gemma-2-27b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Gemma2ForCausalLM
[google/gemma-2-27b](https://huggingface.co/google/gemma-2-27b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__gemma-2-27b-details)
google/gemma-2-27b
938270f5272feb02779b55c2bb2fffdd0f53ff0c
23.850639
gemma
186
27
true
false
false
false
5.614249
0.247522
24.752213
0.564291
37.390737
0.161631
16.163142
0.350671
13.422819
0.439635
13.921094
0.437084
37.453827
false
true
2024-06-24
2024-08-24
0
google/gemma-2-27b
google_gemma-2-27b-it_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
[google/gemma-2-27b-it](https://huggingface.co/google/gemma-2-27b-it) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__gemma-2-27b-it-details)
google/gemma-2-27b-it
f6c533e5eb013c7e31fc74ef042ac4f3fb5cf40b
32.322319
gemma
459
27
true
false
false
true
4.826211
0.797768
79.77677
0.645139
49.272842
0.007553
0.755287
0.375
16.666667
0.403302
9.11276
0.445146
38.349586
false
true
2024-06-24
2024-08-07
1
google/gemma-2-27b
google_gemma-2-2b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
InternLM2ForCausalLM
[google/gemma-2-2b](https://huggingface.co/google/gemma-2-2b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__gemma-2-2b-details)
google/gemma-2-2b
4d05c88d00441bf62bf87dcfd29e204c05089f36
10.129463
gemma
450
2
true
false
false
true
1.518796
0.199312
19.931227
0.365597
11.755808
0.028701
2.870091
0.262584
1.677852
0.423177
11.430469
0.218002
13.111333
false
true
2024-07-16
2024-07-31
0
google/gemma-2-2b
google_gemma-2-2b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
[google/gemma-2-2b](https://huggingface.co/google/gemma-2-2b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__gemma-2-2b-details)
google/gemma-2-2b
0738188b3055bc98daf0fe7211f0091357e5b979
10.334439
gemma
450
2
true
false
false
false
1.418257
0.20176
20.176022
0.370867
12.497306
0.028701
2.870091
0.262584
1.677852
0.421875
11.267708
0.221659
13.517657
false
true
2024-07-16
2024-08-04
0
google/gemma-2-2b
google_gemma-2-2b-it_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
InternLM2ForCausalLM
[google/gemma-2-2b-it](https://huggingface.co/google/gemma-2-2b-it) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__gemma-2-2b-it-details)
google/gemma-2-2b-it
2b6ac3ff954ad896c115bbfa1b571cd93ea2c20f
17.046939
gemma
760
2
true
false
false
true
1.234743
0.566834
56.683378
0.419923
17.980793
0.000755
0.075529
0.274329
3.243848
0.392885
7.077344
0.254987
17.220745
false
true
2024-07-16
2024-07-31
1
google/gemma-2-2b
google_gemma-2-2b-jpn-it_float16
float16
🟢 pretrained
🟢
Original
Gemma2ForCausalLM
[google/gemma-2-2b-jpn-it](https://huggingface.co/google/gemma-2-2b-jpn-it) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__gemma-2-2b-jpn-it-details)
google/gemma-2-2b-jpn-it
6b046bbc091084a1ec89fe03e58871fde10868eb
17.115406
gemma
143
2
true
false
false
false
1.011437
0.507783
50.778268
0.422557
18.525626
0.034743
3.47432
0.285235
4.697987
0.396385
7.68151
0.257813
17.534722
false
true
2024-09-25
2024-10-11
2
google/gemma-2-2b
google_gemma-2-2b-jpn-it_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
[google/gemma-2-2b-jpn-it](https://huggingface.co/google/gemma-2-2b-jpn-it) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__gemma-2-2b-jpn-it-details)
google/gemma-2-2b-jpn-it
6b046bbc091084a1ec89fe03e58871fde10868eb
15.885579
gemma
143
2
true
false
false
true
0.8544
0.52884
52.884014
0.417844
17.848086
0
0
0.275168
3.355705
0.37276
4.928385
0.246676
16.297281
false
true
2024-09-25
2024-10-14
2
google/gemma-2-2b
google_gemma-2-9b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Gemma2ForCausalLM
[google/gemma-2-9b](https://huggingface.co/google/gemma-2-9b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__gemma-2-9b-details)
google/gemma-2-9b
beb0c08e9eeb0548f3aca2ac870792825c357b7d
21.154934
gemma
606
9
true
false
false
false
5.663186
0.203983
20.398321
0.537737
34.096819
0.13142
13.141994
0.328859
10.514541
0.446115
14.297656
0.410322
34.480275
false
true
2024-06-24
2024-07-11
0
google/gemma-2-9b
google_gemma-2-9b-it_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
[google/gemma-2-9b-it](https://huggingface.co/google/gemma-2-9b-it) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__gemma-2-9b-it-details)
google/gemma-2-9b-it
1937c70277fcc5f7fb0fc772fc5bc69378996e71
28.86279
gemma
579
9
true
false
false
true
5.014497
0.743563
74.356264
0.599034
42.13662
0.002266
0.226586
0.360738
14.765101
0.407271
9.742188
0.38755
31.949985
false
true
2024-06-24
2024-07-11
1
google/gemma-2-9b
google_gemma-2b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GemmaForCausalLM
[google/gemma-2b](https://huggingface.co/google/gemma-2b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__gemma-2b-details)
google/gemma-2b
2ac59a5d7bf4e1425010f0d457dde7d146658953
7.358701
gemma
917
2
true
false
false
false
1.236251
0.203758
20.375825
0.338099
8.466713
0.030211
3.021148
0.255034
0.671141
0.397781
7.55599
0.136553
4.061392
false
true
2024-02-08
2024-06-12
0
google/gemma-2b
google_gemma-2b-it_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
[google/gemma-2b-it](https://huggingface.co/google/gemma-2b-it) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__gemma-2b-it-details)
google/gemma-2b-it
de144fb2268dee1066f515465df532c05e699d48
7.221454
gemma
683
2
true
false
false
true
0.35295
0.26903
26.902951
0.315082
5.214303
0.004532
0.453172
0.278523
3.803132
0.334125
3.032292
0.135306
3.922872
false
true
2024-02-08
2024-06-12
0
google/gemma-2b-it
google_gemma-7b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GemmaForCausalLM
[google/gemma-7b](https://huggingface.co/google/gemma-7b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__gemma-7b-details)
google/gemma-7b
a0eac5b80dba224e6ed79d306df50b1e92c2125d
15.455407
gemma
3,071
8
true
false
false
false
1.254914
0.265932
26.593217
0.436153
21.116099
0.074773
7.477341
0.286913
4.9217
0.40624
10.979948
0.294797
21.644134
false
true
2024-02-08
2024-06-08
0
google/gemma-7b
google_gemma-7b-it_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
[google/gemma-7b-it](https://huggingface.co/google/gemma-7b-it) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__gemma-7b-it-details)
google/gemma-7b-it
18329f019fb74ca4b24f97371785268543d687d2
12.868142
gemma
1,139
8
true
false
false
true
1.099954
0.386832
38.683249
0.364558
11.880091
0.018127
1.812689
0.284396
4.58613
0.427427
12.528385
0.169465
7.718307
false
true
2024-02-13
2024-06-12
1
google/gemma-7b
google_mt5-base_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MT5ForConditionalGeneration
[google/mt5-base](https://huggingface.co/google/mt5-base) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__mt5-base-details)
google/mt5-base
2eb15465c5dd7f72a8f7984306ad05ebc3dd1e1f
3.565282
apache-2.0
195
0
true
false
false
false
0.20004
0.164516
16.451571
0.288316
1.298551
0
0
0.239094
0
0.367208
2.867708
0.106965
0.773862
false
true
2022-03-02
2024-09-06
0
google/mt5-base
google_mt5-small_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MT5ForConditionalGeneration
[google/mt5-small](https://huggingface.co/google/mt5-small) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__mt5-small-details)
google/mt5-small
73fb5dbe4756edadc8fbe8c769b0a109493acf7a
4.255928
apache-2.0
115
0
true
false
false
false
0.180494
0.17181
17.180969
0.276584
1.070971
0
0
0.24245
0
0.38575
5.91875
0.112284
1.364879
false
true
2022-03-02
2024-09-06
0
google/mt5-small
google_mt5-xl_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MT5ForConditionalGeneration
[google/mt5-xl](https://huggingface.co/google/mt5-xl) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__mt5-xl-details)
google/mt5-xl
63fc6450d80515b48e026b69ef2fbbd426433e84
5.19142
apache-2.0
22
3
true
false
false
false
0.903767
0.195964
19.596449
0.304736
3.282462
0
0
0.264262
1.901566
0.379521
5.040104
0.111951
1.32794
false
true
2022-03-02
2024-09-06
0
google/mt5-xl
google_mt5-xxl_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
T5ForConditionalGeneration
[google/mt5-xxl](https://huggingface.co/google/mt5-xxl) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__mt5-xxl-details)
google/mt5-xxl
e07c395916dfbc315d4e5e48b4a54a1e8821b5c0
5.103077
apache-2.0
67
11
true
false
false
false
2.281939
0.235757
23.575668
0.295934
2.504711
0
0
0.241611
0
0.368948
3.551823
0.108876
0.986259
false
true
2022-03-02
2024-09-06
0
google/mt5-xxl
google_recurrentgemma-2b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
RecurrentGemmaForCausalLM
[google/recurrentgemma-2b](https://huggingface.co/google/recurrentgemma-2b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__recurrentgemma-2b-details)
google/recurrentgemma-2b
195f13c55b371fc721eda0662c00c64642c70e17
6.952186
gemma
91
2
true
false
false
false
3.692653
0.301703
30.170282
0.319736
4.820362
0.016616
1.661631
0.245805
0
0.344573
3.104948
0.117603
1.955895
false
true
2024-04-06
2024-06-13
0
google/recurrentgemma-2b
google_recurrentgemma-2b-it_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
RecurrentGemmaForCausalLM
[google/recurrentgemma-2b-it](https://huggingface.co/google/recurrentgemma-2b-it) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__recurrentgemma-2b-it-details)
google/recurrentgemma-2b-it
150248167d171fbdf4b02e7d28a4b3d749e570f6
7.945553
gemma
109
2
true
false
false
true
1.933036
0.294933
29.4933
0.333
7.978764
0.016616
1.661631
0.253356
0.447427
0.334063
3.624479
0.140209
4.467716
false
true
2024-04-08
2024-06-12
0
google/recurrentgemma-2b-it
google_recurrentgemma-9b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
RecurrentGemmaForCausalLM
[google/recurrentgemma-9b](https://huggingface.co/google/recurrentgemma-9b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__recurrentgemma-9b-details)
google/recurrentgemma-9b
7b0ed98fb889ba8bdfa7c690f08f2e57a7c48dae
13.684285
gemma
59
9
true
false
false
false
23.20619
0.311594
31.159435
0.395626
15.323369
0.064955
6.495468
0.285235
4.697987
0.38026
6.599219
0.260472
17.83023
false
true
2024-06-07
2024-07-04
0
google/recurrentgemma-9b
google_recurrentgemma-9b-it_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
RecurrentGemmaForCausalLM
[google/recurrentgemma-9b-it](https://huggingface.co/google/recurrentgemma-9b-it) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__recurrentgemma-9b-it-details)
google/recurrentgemma-9b-it
43e62f98c3d496a5469ef4b18c1b11e417d68d1d
19.230703
gemma
50
9
true
false
false
true
13.362608
0.501038
50.103836
0.436719
21.62158
0.067221
6.722054
0.270134
2.684564
0.437906
13.771615
0.284325
20.48057
false
true
2024-06-07
2024-07-05
0
google/recurrentgemma-9b-it
google_switch-base-8_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
SwitchTransformersForConditionalGeneration
[google/switch-base-8](https://huggingface.co/google/switch-base-8) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__switch-base-8-details)
google/switch-base-8
92fe2d22b024d9937146fe097ba3d3a7ba146e1b
3.29595
apache-2.0
15
0
true
false
false
false
0.146703
0.158521
15.85205
0.287631
1.702478
0
0
0.25
0
0.35174
1.133333
0.109791
1.08784
false
true
2022-10-24
2024-09-06
0
google/switch-base-8
google_umt5-base_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
UMT5ForConditionalGeneration
[google/umt5-base](https://huggingface.co/google/umt5-base) [📑](https://huggingface.co/datasets/open-llm-leaderboard/google__umt5-base-details)
google/umt5-base
0de9394d54f8975e71838d309de1cb496c894ab9
3.441046
apache-2.0
13
-1
true
false
false
false
0.668046
0.174632
17.46322
0.278773
0.813553
0
0
0.254195
0.559284
0.338219
0.94401
0.107796
0.866209
false
true
2023-07-02
2024-09-06
0
google/umt5-base
goulue5_merging_LLM_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[goulue5/merging_LLM](https://huggingface.co/goulue5/merging_LLM) [📑](https://huggingface.co/datasets/open-llm-leaderboard/goulue5__merging_LLM-details)
goulue5/merging_LLM
587115b34d72ef957fee2d8348b3ade3ae06d4a8
16.409985
0
1
false
false
false
false
0.551465
0.32326
32.326006
0.42165
18.28283
0.07855
7.854985
0.291107
5.480984
0.433281
12.760156
0.295795
21.75495
false
false
2024-11-21
2024-11-22
0
goulue5/merging_LLM
gpt2_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GPT2LMHeadModel
[gpt2](https://huggingface.co/gpt2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/gpt2-details)
gpt2
607a30d783dfa663caf39e06633721c8d4cfcd7e
6.39103
mit
2,423
0
true
false
false
false
0.323928
0.193417
19.34168
0.303639
2.714298
0.003021
0.302115
0.260067
1.342282
0.432417
12.985417
0.114943
1.660387
false
true
2022-03-02
2024-06-26
0
gpt2
gpt2_float16
float16
🟢 pretrained
🟢
Original
GPT2LMHeadModel
[gpt2](https://huggingface.co/gpt2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/gpt2-details)
gpt2
607a30d783dfa663caf39e06633721c8d4cfcd7e
5.977737
mit
2,423
0
true
false
false
false
0.039245
0.083333
8.333333
0.308333
9.199755
0
0
0.233333
0
0.433333
18.333333
0.1
0
false
true
2022-03-02
2024-06-26
0
gpt2
gradientai_Llama-3-8B-Instruct-Gradient-1048k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
[gradientai/Llama-3-8B-Instruct-Gradient-1048k](https://huggingface.co/gradientai/Llama-3-8B-Instruct-Gradient-1048k) [📑](https://huggingface.co/datasets/open-llm-leaderboard/gradientai__Llama-3-8B-Instruct-Gradient-1048k-details)
gradientai/Llama-3-8B-Instruct-Gradient-1048k
8697fb25cb77c852311e03b4464b8467471d56a4
18.24557
llama3
676
8
true
false
false
true
0.887164
0.445559
44.555889
0.43459
21.010529
0.05136
5.135952
0.277685
3.691275
0.42975
13.51875
0.294049
21.561022
false
true
2024-04-29
2024-06-12
0
gradientai/Llama-3-8B-Instruct-Gradient-1048k
grimjim_Llama-3-Instruct-8B-SPPO-Iter3-SimPO-merge_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[grimjim/Llama-3-Instruct-8B-SPPO-Iter3-SimPO-merge](https://huggingface.co/grimjim/Llama-3-Instruct-8B-SPPO-Iter3-SimPO-merge) [📑](https://huggingface.co/datasets/open-llm-leaderboard/grimjim__Llama-3-Instruct-8B-SPPO-Iter3-SimPO-merge-details)
grimjim/Llama-3-Instruct-8B-SPPO-Iter3-SimPO-merge
7a8d334dce0a2ce948f75612b8d3a61c53d094aa
20.887036
llama3
2
8
true
false
false
false
0.547548
0.427124
42.712447
0.496169
28.258015
0.102719
10.271903
0.290268
5.369128
0.404323
9.540365
0.362533
29.170361
true
false
2024-06-28
2024-06-29
1
grimjim/Llama-3-Instruct-8B-SPPO-Iter3-SimPO-merge (Merge)
grimjim_Llama-3-Instruct-8B-SimPO-SPPO-Iter3-merge_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[grimjim/Llama-3-Instruct-8B-SimPO-SPPO-Iter3-merge](https://huggingface.co/grimjim/Llama-3-Instruct-8B-SimPO-SPPO-Iter3-merge) [📑](https://huggingface.co/datasets/open-llm-leaderboard/grimjim__Llama-3-Instruct-8B-SimPO-SPPO-Iter3-merge-details)
grimjim/Llama-3-Instruct-8B-SimPO-SPPO-Iter3-merge
8f4d460ea20e24e48914156af7def305c0cd347f
23.688475
llama3
2
8
true
false
false
true
0.616942
0.68059
68.058972
0.502173
29.073286
0.067976
6.797583
0.262584
1.677852
0.38851
6.697135
0.368434
29.82602
true
false
2024-06-28
2024-09-17
1
grimjim/Llama-3-Instruct-8B-SimPO-SPPO-Iter3-merge (Merge)
grimjim_Llama-3.1-8B-Instruct-abliterated_via_adapter_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[grimjim/Llama-3.1-8B-Instruct-abliterated_via_adapter](https://huggingface.co/grimjim/Llama-3.1-8B-Instruct-abliterated_via_adapter) [📑](https://huggingface.co/datasets/open-llm-leaderboard/grimjim__Llama-3.1-8B-Instruct-abliterated_via_adapter-details)
grimjim/Llama-3.1-8B-Instruct-abliterated_via_adapter
b37ab2f859c96b125ff1c45c7ff0e267aa229156
23.179537
llama3.1
28
8
true
false
false
false
0.901915
0.48695
48.695018
0.510527
29.41599
0.137462
13.746224
0.313758
8.501119
0.401031
9.26224
0.36511
29.456634
true
false
2024-07-25
2024-09-17
1
grimjim/Llama-3.1-8B-Instruct-abliterated_via_adapter (Merge)
grimjim_Magot-v1-Gemma2-8k-9B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
[grimjim/Magot-v1-Gemma2-8k-9B](https://huggingface.co/grimjim/Magot-v1-Gemma2-8k-9B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/grimjim__Magot-v1-Gemma2-8k-9B-details)
grimjim/Magot-v1-Gemma2-8k-9B
afae94acb42bc0dcf1d31b7338cb79c0bcab1829
23.706235
gemma
2
9
true
false
false
false
2.954037
0.299678
29.967819
0.601945
42.818128
0.046073
4.607251
0.346477
12.863535
0.448844
14.905469
0.433677
37.075207
true
false
2024-09-09
2024-09-19
1
grimjim/Magot-v1-Gemma2-8k-9B (Merge)
grimjim_llama-3-Nephilim-v1-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[grimjim/llama-3-Nephilim-v1-8B](https://huggingface.co/grimjim/llama-3-Nephilim-v1-8B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/grimjim__llama-3-Nephilim-v1-8B-details)
grimjim/llama-3-Nephilim-v1-8B
642799c8c768c53e831a03a1224db875116be866
21.742325
cc-by-nc-4.0
1
8
true
false
false
false
0.856873
0.427724
42.772399
0.513182
29.907537
0.09139
9.138973
0.302013
6.935123
0.413625
10.636458
0.379571
31.06346
true
false
2024-06-21
2024-06-26
1
grimjim/llama-3-Nephilim-v1-8B (Merge)
grimjim_llama-3-Nephilim-v2-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[grimjim/llama-3-Nephilim-v2-8B](https://huggingface.co/grimjim/llama-3-Nephilim-v2-8B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/grimjim__llama-3-Nephilim-v2-8B-details)
grimjim/llama-3-Nephilim-v2-8B
924f56cdefbfaf38deb6aee3ad301ced027e142d
20.587662
cc-by-nc-4.0
1
8
true
false
false
false
0.700873
0.392228
39.222818
0.504821
29.896264
0.10574
10.574018
0.299497
6.599553
0.3895
7.8875
0.364112
29.345819
true
false
2024-06-26
2024-09-18
1
grimjim/llama-3-Nephilim-v2-8B (Merge)
grimjim_llama-3-Nephilim-v2.1-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[grimjim/llama-3-Nephilim-v2.1-8B](https://huggingface.co/grimjim/llama-3-Nephilim-v2.1-8B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/grimjim__llama-3-Nephilim-v2.1-8B-details)
grimjim/llama-3-Nephilim-v2.1-8B
5f516d9df1778dbe53ea941a754aef73b87e8eaa
20.447555
cc-by-nc-4.0
1
8
true
false
false
false
0.713158
0.389505
38.95054
0.509504
29.819664
0.100453
10.045317
0.299497
6.599553
0.3935
7.8875
0.364445
29.382757
true
false
2024-07-09
2024-09-18
1
grimjim/llama-3-Nephilim-v2.1-8B (Merge)
grimjim_llama-3-Nephilim-v3-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[grimjim/llama-3-Nephilim-v3-8B](https://huggingface.co/grimjim/llama-3-Nephilim-v3-8B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/grimjim__llama-3-Nephilim-v3-8B-details)
grimjim/llama-3-Nephilim-v3-8B
fd012ba05116aad7dc297d0a866ddb3345a056a1
20.663929
cc-by-nc-4.0
11
8
true
false
false
false
0.56409
0.417383
41.738254
0.501267
28.955635
0.098943
9.89426
0.295302
6.040268
0.398927
8.332552
0.361203
29.022606
true
false
2024-07-14
2024-08-26
1
grimjim/llama-3-Nephilim-v3-8B (Merge)
gupta-tanish_llama-7b-dpo-baseline_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[gupta-tanish/llama-7b-dpo-baseline](https://huggingface.co/gupta-tanish/llama-7b-dpo-baseline) [📑](https://huggingface.co/datasets/open-llm-leaderboard/gupta-tanish__llama-7b-dpo-baseline-details)
gupta-tanish/llama-7b-dpo-baseline
1b5f1ef3ffa3b550619fbf64c33b6fd79e1bd559
11.85729
apache-2.0
0
6
true
false
false
false
0.781316
0.269304
26.930433
0.389689
14.380522
0.019637
1.963746
0.262584
1.677852
0.445625
14.769792
0.202793
11.421395
false
false
2024-09-29
2024-09-29
1
gupta-tanish/llama-7b-dpo-baseline (Merge)
h2oai_h2o-danube3-4b-base_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
[h2oai/h2o-danube3-4b-base](https://huggingface.co/h2oai/h2o-danube3-4b-base) [📑](https://huggingface.co/datasets/open-llm-leaderboard/h2oai__h2o-danube3-4b-base-details)
h2oai/h2o-danube3-4b-base
6bdf2f1e317143c998b88d9e9d72facc621a863f
10.01532
apache-2.0
21
3
true
false
false
false
0.444502
0.233809
23.380852
0.359908
10.564444
0.018127
1.812689
0.291107
5.480984
0.377813
6.526563
0.210938
12.326389
false
false
2024-07-04
2024-08-10
0
h2oai/h2o-danube3-4b-base
h2oai_h2o-danube3-4b-chat_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
[h2oai/h2o-danube3-4b-chat](https://huggingface.co/h2oai/h2o-danube3-4b-chat) [📑](https://huggingface.co/datasets/open-llm-leaderboard/h2oai__h2o-danube3-4b-chat-details)
h2oai/h2o-danube3-4b-chat
1e5c6fa6620f8bf078958069ab4581cd88e0202c
11.395014
apache-2.0
65
3
true
false
false
true
0.462621
0.362877
36.287717
0.346617
8.839703
0.030211
3.021148
0.260067
1.342282
0.378125
5.232292
0.222822
13.646941
false
false
2024-07-04
2024-07-15
0
h2oai/h2o-danube3-4b-chat
h2oai_h2o-danube3-500m-chat_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
[h2oai/h2o-danube3-500m-chat](https://huggingface.co/h2oai/h2o-danube3-500m-chat) [📑](https://huggingface.co/datasets/open-llm-leaderboard/h2oai__h2o-danube3-500m-chat-details)
h2oai/h2o-danube3-500m-chat
c202f976c26875541e738ea978c8158fa536da9a
5.028207
apache-2.0
31
0
true
false
false
true
0.218903
0.220794
22.079416
0.303469
3.06537
0.006042
0.60423
0.230705
0
0.343396
2.824479
0.114362
1.595745
false
false
2024-07-04
2024-10-11
0
h2oai/h2o-danube3-500m-chat
h2oai_h2o-danube3.1-4b-chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
[h2oai/h2o-danube3.1-4b-chat](https://huggingface.co/h2oai/h2o-danube3.1-4b-chat) [📑](https://huggingface.co/datasets/open-llm-leaderboard/h2oai__h2o-danube3.1-4b-chat-details)
h2oai/h2o-danube3.1-4b-chat
e649b5c5844432e0b3e1b1102b6218604e6cbdb8
16.210718
apache-2.0
1
3
true
false
false
true
0.299141
0.502112
50.211217
0.360842
10.942063
0.021148
2.114804
0.285235
4.697987
0.410156
10.202865
0.271858
19.095375
false
false
2024-11-29
2024-11-29
0
h2oai/h2o-danube3.1-4b-chat
haoranxu_ALMA-13B-R_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
[haoranxu/ALMA-13B-R](https://huggingface.co/haoranxu/ALMA-13B-R) [📑](https://huggingface.co/datasets/open-llm-leaderboard/haoranxu__ALMA-13B-R-details)
haoranxu/ALMA-13B-R
b69ebad694274b929cfcf3db29dd7bb93d752e39
3.587775
mit
78
13
true
false
false
false
0.96263
0.003922
0.392182
0.345656
8.819669
0
0
0.25755
1.006711
0.352792
2.232292
0.181682
9.075798
false
false
2024-01-17
2024-10-01
0
haoranxu/ALMA-13B-R
haoranxu_Llama-3-Instruct-8B-CPO-SimPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
[haoranxu/Llama-3-Instruct-8B-CPO-SimPO](https://huggingface.co/haoranxu/Llama-3-Instruct-8B-CPO-SimPO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/haoranxu__Llama-3-Instruct-8B-CPO-SimPO-details)
haoranxu/Llama-3-Instruct-8B-CPO-SimPO
3ca4b5c3a6395ff090e1039d55ac1f6120777302
24.570858
mit
1
8
true
false
false
true
0.745335
0.704645
70.464479
0.50483
29.762188
0.082326
8.232628
0.292785
5.704698
0.356667
3.416667
0.3686
29.844489
false
false
2024-06-19
2024-07-28
0
haoranxu/Llama-3-Instruct-8B-CPO-SimPO
haoranxu_Llama-3-Instruct-8B-SimPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
[haoranxu/Llama-3-Instruct-8B-SimPO](https://huggingface.co/haoranxu/Llama-3-Instruct-8B-SimPO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/haoranxu__Llama-3-Instruct-8B-SimPO-details)
haoranxu/Llama-3-Instruct-8B-SimPO
8346770280fa169d41d737785dd63a66e9d94501
24.827084
llama3
1
8
true
false
false
true
0.579578
0.734745
73.474492
0.497924
28.226376
0.077795
7.779456
0.290268
5.369128
0.356604
3.742188
0.373338
30.370863
false
false
2024-06-07
2024-07-28
1
meta-llama/Meta-Llama-3-8B-Instruct
hon9kon9ize_CantoneseLLMChat-v0.5_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
[hon9kon9ize/CantoneseLLMChat-v0.5](https://huggingface.co/hon9kon9ize/CantoneseLLMChat-v0.5) [📑](https://huggingface.co/datasets/open-llm-leaderboard/hon9kon9ize__CantoneseLLMChat-v0.5-details)
hon9kon9ize/CantoneseLLMChat-v0.5
812eb4f168c3ea258ebb220393401db9578e0f67
15.783567
apache-2.0
9
6
true
false
false
false
0.833634
0.323085
32.308497
0.434524
20.761385
0.030967
3.096677
0.277685
3.691275
0.470646
18.130729
0.250416
16.71284
false
false
2024-07-01
2024-07-07
0
hon9kon9ize/CantoneseLLMChat-v0.5
hon9kon9ize_CantoneseLLMChat-v1.0-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[hon9kon9ize/CantoneseLLMChat-v1.0-7B](https://huggingface.co/hon9kon9ize/CantoneseLLMChat-v1.0-7B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/hon9kon9ize__CantoneseLLMChat-v1.0-7B-details)
hon9kon9ize/CantoneseLLMChat-v1.0-7B
4703b1afc7aab8e3a8059432fd1c4b0aba011482
23.252108
other
2
7
true
false
false
true
1.831396
0.445484
44.548354
0.486573
28.536136
0.195619
19.561934
0.322148
9.619687
0.388292
6.303125
0.378491
30.94341
false
false
2024-10-02
2024-10-10
1
Removed
hotmailuser_Gemma2Crono-27B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
[hotmailuser/Gemma2Crono-27B](https://huggingface.co/hotmailuser/Gemma2Crono-27B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/hotmailuser__Gemma2Crono-27B-details)
hotmailuser/Gemma2Crono-27B
68feccf9af840291c9ce4dea83bdd7b68c351f45
36.200632
apache-2.0
0
27
true
false
false
false
3.887914
0.708616
70.861647
0.650534
50.103412
0.23716
23.716012
0.370805
16.107383
0.456687
16.052604
0.463265
40.362736
true
false
2024-12-02
2024-12-02
1
hotmailuser/Gemma2Crono-27B (Merge)
hotmailuser_Gemma2SimPO-27B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
[hotmailuser/Gemma2SimPO-27B](https://huggingface.co/hotmailuser/Gemma2SimPO-27B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/hotmailuser__Gemma2SimPO-27B-details)
hotmailuser/Gemma2SimPO-27B
59d5de8216b2b53abcf56a79ebb630d17a856d00
35.388021
0
27
false
false
false
false
4.464646
0.72223
72.223035
0.641316
49.159219
0.219033
21.903323
0.358221
14.42953
0.444656
14.148698
0.464179
40.464317
false
false
2024-12-01
0
Removed
hotmailuser_Gemma2atlas-27B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
[hotmailuser/Gemma2atlas-27B](https://huggingface.co/hotmailuser/Gemma2atlas-27B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/hotmailuser__Gemma2atlas-27B-details)
hotmailuser/Gemma2atlas-27B
8021ee95d2515abefc6b15924f9f49e2e98b88b8
35.771827
apache-2.0
0
27
true
false
false
false
4.26537
0.721356
72.1356
0.654496
50.713279
0.212236
21.223565
0.355705
14.09396
0.444531
14.79974
0.474983
41.66482
true
false
2024-12-01
2024-12-01
1
hotmailuser/Gemma2atlas-27B (Merge)
hotmailuser_Gemma2magnum-27b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
[hotmailuser/Gemma2magnum-27b](https://huggingface.co/hotmailuser/Gemma2magnum-27b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/hotmailuser__Gemma2magnum-27b-details)
hotmailuser/Gemma2magnum-27b
c2beed3653f3732b0af82a9dd1cddd5919c9c686
32.353904
0
27
false
false
false
false
4.218189
0.50506
50.505991
0.619959
46.101146
0.213746
21.374622
0.385067
18.008949
0.472344
18.176302
0.459608
39.956413
false
false
2024-12-02
0
Removed
hotmailuser_Qwen2.5-HomerSlerp-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[hotmailuser/Qwen2.5-HomerSlerp-7B](https://huggingface.co/hotmailuser/Qwen2.5-HomerSlerp-7B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/hotmailuser__Qwen2.5-HomerSlerp-7B-details)
hotmailuser/Qwen2.5-HomerSlerp-7B
19fe6a99882323c775e8208e8ebc7e219a80435b
29.34447
apache-2.0
0
7
true
false
false
false
0.600052
0.448781
44.878146
0.563251
37.404119
0.326284
32.628399
0.313758
8.501119
0.438333
13.225
0.45487
39.430038
true
false
2024-12-07
2024-12-07
1
hotmailuser/Qwen2.5-HomerSlerp-7B (Merge)
huggyllama_llama-13b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[huggyllama/llama-13b](https://huggingface.co/huggyllama/llama-13b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/huggyllama__llama-13b-details)
huggyllama/llama-13b
bf57045473f207bb1de1ed035ace226f4d9f9bba
9.291479
other
137
13
true
false
false
false
1.106141
0.241053
24.105263
0.398789
16.145707
0.01435
1.435045
0.255034
0.671141
0.346219
2.810677
0.195229
10.581043
false
false
2023-04-03
2024-07-04
0
huggyllama/llama-13b
huggyllama_llama-65b_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
[huggyllama/llama-65b](https://huggingface.co/huggyllama/llama-65b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/huggyllama__llama-65b-details)
huggyllama/llama-65b
49707c5313d34d1c5a846e29cf2a2a650c22c8ee
13.587327
other
74
65
true
false
false
false
9.330108
0.252593
25.259312
0.470256
25.254277
0.024924
2.492447
0.276007
3.467562
0.359458
1.965625
0.307763
23.084737
false
false
2023-04-04
2024-06-26
0
huggyllama/llama-65b
huggyllama_llama-7b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[huggyllama/llama-7b](https://huggingface.co/huggyllama/llama-7b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/huggyllama__llama-7b-details)
huggyllama/llama-7b
4782ad278652c7c71b72204d462d6d01eaaf7549
6.389824
other
303
6
true
false
false
false
0.563604
0.250095
25.00953
0.327731
7.076661
0.006798
0.679758
0.252517
0.33557
0.335396
1.757812
0.131316
3.47961
false
false
2023-04-03
2024-07-04
0
huggyllama/llama-7b
huihui-ai_QwQ-32B-Coder-Fusion-9010_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[huihui-ai/QwQ-32B-Coder-Fusion-9010](https://huggingface.co/huihui-ai/QwQ-32B-Coder-Fusion-9010) [📑](https://huggingface.co/datasets/open-llm-leaderboard/huihui-ai__QwQ-32B-Coder-Fusion-9010-details)
huihui-ai/QwQ-32B-Coder-Fusion-9010
6d19e2749fabb24efe732a2614e7458d61d92426
39.429371
apache-2.0
3
32
true
false
false
true
11.150977
0.577825
57.782462
0.672741
53.023418
0.402568
40.256798
0.361577
14.876957
0.468198
19.52474
0.560007
51.11185
false
false
2024-11-29
2024-12-07
1
huihui-ai/QwQ-32B-Coder-Fusion-9010 (Merge)
huihui-ai_Qwen2.5-14B-Instruct-abliterated-v2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[huihui-ai/Qwen2.5-14B-Instruct-abliterated-v2](https://huggingface.co/huihui-ai/Qwen2.5-14B-Instruct-abliterated-v2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/huihui-ai__Qwen2.5-14B-Instruct-abliterated-v2-details)
huihui-ai/Qwen2.5-14B-Instruct-abliterated-v2
68f298d4017b8999dc963fbc560b02eaefa41de3
32.91122
apache-2.0
9
14
true
false
false
true
1.623983
0.832764
83.276373
0.632382
47.406188
0
0
0.333893
11.185682
0.421969
11.579427
0.496177
44.019651
false
false
2024-10-09
2024-12-07
2
Qwen/Qwen2.5-14B
huihui-ai_Qwen2.5-7B-Instruct-abliterated_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[huihui-ai/Qwen2.5-7B-Instruct-abliterated](https://huggingface.co/huihui-ai/Qwen2.5-7B-Instruct-abliterated) [📑](https://huggingface.co/datasets/open-llm-leaderboard/huihui-ai__Qwen2.5-7B-Instruct-abliterated-details)
huihui-ai/Qwen2.5-7B-Instruct-abliterated
c04c14c82962506e2b16f58f9f6b0a2e60a6afde
26.647506
apache-2.0
2
7
true
false
false
true
2.166124
0.754603
75.460334
0.526159
32.886673
0
0
0.315436
8.724832
0.396667
7.483333
0.417969
35.329861
false
false
2024-09-19
2024-09-24
2
Qwen/Qwen2.5-7B
huihui-ai_Qwen2.5-7B-Instruct-abliterated-v2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[huihui-ai/Qwen2.5-7B-Instruct-abliterated-v2](https://huggingface.co/huihui-ai/Qwen2.5-7B-Instruct-abliterated-v2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/huihui-ai__Qwen2.5-7B-Instruct-abliterated-v2-details)
huihui-ai/Qwen2.5-7B-Instruct-abliterated-v2
05d179c1108cc2dc1c1a16a8255ac6f57eac5d32
26.999905
apache-2.0
21
7
true
false
false
true
2.21974
0.760648
76.064841
0.537669
34.369627
0
0
0.308725
7.829978
0.398063
8.091146
0.420795
35.643839
false
false
2024-09-22
2024-09-24
2
Qwen/Qwen2.5-7B
iRyanBell_ARC1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/iRyanBell/ARC1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">iRyanBell/ARC1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/iRyanBell__ARC1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
iRyanBell/ARC1
28176c0fb77fa43e1410766faf35d2a2681566e9
19.623911
llama3
1
8
true
false
false
false
0.924664
0.441113
44.111291
0.4903
26.564495
0.066465
6.646526
0.294463
5.928412
0.399052
8.148177
0.337101
26.344563
false
false
2024-05-30
2024-06-26
0
iRyanBell/ARC1
iRyanBell_ARC1-II_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/iRyanBell/ARC1-II" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">iRyanBell/ARC1-II</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/iRyanBell__ARC1-II-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
iRyanBell/ARC1-II
c81076b9bdaac0722b33e411a49b07a296e8fae8
9.320256
llama3
1
8
true
false
false
false
0.895276
0.170836
17.083561
0.338178
7.246229
0.007553
0.755287
0.271812
2.908277
0.491292
20.311458
0.168551
7.616726
false
false
2024-06-12
2024-06-26
0
iRyanBell/ARC1-II
ibivibiv_colossus_120b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ibivibiv/colossus_120b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ibivibiv/colossus_120b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ibivibiv__colossus_120b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ibivibiv/colossus_120b
b4c11f98bd874bfa454a0bb46153335cfb9b06a3
25.377439
apache-2.0
1
117
true
false
false
false
13.752432
0.427599
42.759877
0.606141
44.071498
0.054381
5.438066
0.308725
7.829978
0.473313
19.264062
0.39611
32.901152
false
false
2024-04-12
2024-06-27
0
ibivibiv/colossus_120b
ibivibiv_multimaster-7b-v6_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/ibivibiv/multimaster-7b-v6" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ibivibiv/multimaster-7b-v6</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ibivibiv__multimaster-7b-v6-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ibivibiv/multimaster-7b-v6
7b3bfecb654c86565c65cd510dd1138cb3e75087
21.127533
apache-2.0
1
35
true
false
false
false
2.574681
0.447308
44.730759
0.519352
32.40128
0.058157
5.81571
0.303691
7.158837
0.439573
13.379948
0.309508
23.278664
false
false
2024-02-24
2024-06-28
0
ibivibiv/multimaster-7b-v6
ibm_merlinite-7b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/ibm/merlinite-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ibm/merlinite-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ibm__merlinite-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ibm/merlinite-7b
233d12759d5bb9344231dafdb51310ec19d79c0e
16.763622
apache-2.0
103
7
true
false
false
false
0.550397
0.24987
24.987034
0.500713
29.977248
0.024924
2.492447
0.29698
6.263982
0.441156
13.877865
0.306848
22.983156
false
true
2024-03-02
2024-06-09
1
mistralai/Mistral-7B-v0.1
ibm-granite_granite-3.0-1b-a400m-base_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GraniteForCausalLM
<a target="_blank" href="https://huggingface.co/ibm-granite/granite-3.0-1b-a400m-base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ibm-granite/granite-3.0-1b-a400m-base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ibm-granite__granite-3.0-1b-a400m-base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ibm-granite/granite-3.0-1b-a400m-base
8f3d6d6fb24a1d2528f24bad0d2ae3e8fc6f3232
5.917497
apache-2.0
5
1
true
false
false
false
2.476425
0.240403
24.040324
0.322121
6.055008
0.019637
1.963746
0.247483
0
0.336729
1.757812
0.115193
1.688091
false
true
2024-10-03
0
ibm-granite/granite-3.0-1b-a400m-base
ibm-granite_granite-3.0-1b-a400m-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GraniteForCausalLM
<a target="_blank" href="https://huggingface.co/ibm-granite/granite-3.0-1b-a400m-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ibm-granite/granite-3.0-1b-a400m-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ibm-granite__granite-3.0-1b-a400m-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ibm-granite/granite-3.0-1b-a400m-instruct
acb9675a7d67b8657d9b8105d5cbd5818408293f
8.018876
apache-2.0
19
1
true
false
false
true
2.184326
0.333152
33.315159
0.322395
5.453219
0.024924
2.492447
0.260906
1.454139
0.362281
2.685156
0.124418
2.713135
false
true
2024-10-03
1
ibm-granite/granite-3.0-1b-a400m-instruct (Merge)
ibm-granite_granite-3.0-2b-base_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GraniteForCausalLM
<a target="_blank" href="https://huggingface.co/ibm-granite/granite-3.0-2b-base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ibm-granite/granite-3.0-2b-base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ibm-granite__granite-3.0-2b-base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ibm-granite/granite-3.0-2b-base
532f55c03d71a31905c0b825eba4b24fe7f7936b
14.108373
apache-2.0
19
2
true
false
false
false
1.046167
0.387382
38.738215
0.404748
17.56375
0.055136
5.513595
0.280201
4.026846
0.343427
3.461719
0.238115
15.346114
false
true
2024-10-02
0
ibm-granite/granite-3.0-2b-base
ibm-granite_granite-3.0-2b-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GraniteForCausalLM
<a target="_blank" href="https://huggingface.co/ibm-granite/granite-3.0-2b-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ibm-granite/granite-3.0-2b-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ibm-granite__granite-3.0-2b-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ibm-granite/granite-3.0-2b-instruct
342f92f4a0b4d6d83c0b61dc6c122e253a4efebd
18.320566
apache-2.0
42
2
true
false
false
true
1.018948
0.513977
51.397736
0.441198
21.737891
0.087613
8.761329
0.299497
6.599553
0.35149
1.269531
0.281416
20.157358
false
true
2024-10-02
1
ibm-granite/granite-3.0-2b-instruct (Merge)
ibm-granite_granite-3.0-3b-a800m-base_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GraniteForCausalLM
<a target="_blank" href="https://huggingface.co/ibm-granite/granite-3.0-3b-a800m-base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ibm-granite/granite-3.0-3b-a800m-base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ibm-granite__granite-3.0-3b-a800m-base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ibm-granite/granite-3.0-3b-a800m-base
0d1d12f91791b25289ef407e39d88f00d1256d10
9.426901
apache-2.0
4
3
true
false
false
false
3.535784
0.273226
27.322615
0.36675
11.348442
0.044562
4.456193
0.251678
0.223714
0.341969
3.31276
0.189079
9.89768
false
true
2024-10-03
0
ibm-granite/granite-3.0-3b-a800m-base
ibm-granite_granite-3.0-3b-a800m-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GraniteForCausalLM
<a target="_blank" href="https://huggingface.co/ibm-granite/granite-3.0-3b-a800m-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ibm-granite/granite-3.0-3b-a800m-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ibm-granite__granite-3.0-3b-a800m-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ibm-granite/granite-3.0-3b-a800m-instruct
ab0c732243cfd50a601fa393dd46a2c5993746f7
13.66036
apache-2.0
16
3
true
false
false
true
3.076681
0.429822
42.982176
0.375278
13.16301
0.067976
6.797583
0.28104
4.138702
0.348667
2.083333
0.215176
12.797355
false
true
2024-10-03
2024-10-20
1
ibm-granite/granite-3.0-3b-a800m-instruct (Merge)
ibm-granite_granite-3.0-8b-base_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GraniteForCausalLM
<a target="_blank" href="https://huggingface.co/ibm-granite/granite-3.0-8b-base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ibm-granite/granite-3.0-8b-base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ibm-granite__granite-3.0-8b-base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ibm-granite/granite-3.0-8b-base
1edd1f646abfcd90ed5d6c0d9711fbb02c947884
21.65316
apache-2.0
22
8
true
false
false
false
1.885656
0.458348
45.834829
0.494376
27.974358
0.098943
9.89426
0.325503
10.067114
0.408135
10.45026
0.331283
25.698138
false
true
2024-10-02
2024-10-20
0
ibm-granite/granite-3.0-8b-base
ibm-granite_granite-3.0-8b-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GraniteForCausalLM
<a target="_blank" href="https://huggingface.co/ibm-granite/granite-3.0-8b-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ibm-granite/granite-3.0-8b-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ibm-granite__granite-3.0-8b-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ibm-granite/granite-3.0-8b-instruct
e0a466fb25b9e07e9c2dc93380a360189700d1f8
23.864033
apache-2.0
184
8
true
false
false
true
1.712993
0.530963
53.09634
0.519187
31.588159
0.132175
13.217523
0.332215
10.961969
0.390063
7.024479
0.345662
27.29573
false
true
2024-10-02
2024-10-20
1
ibm-granite/granite-3.0-8b-instruct (Merge)
ibm-granite_granite-7b-base_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ibm-granite/granite-7b-base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ibm-granite/granite-7b-base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ibm-granite__granite-7b-base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ibm-granite/granite-7b-base
23fcb4cb5b69f8a122fb944491e9f1ad664ba37b
7.757645
apache-2.0
26
6
true
false
false
false
0.652624
0.241427
24.142719
0.348044
9.0508
0.006798
0.679758
0.245805
0
0.35549
3.402865
0.183428
9.269725
false
true
2024-04-19
2024-06-12
0
ibm-granite/granite-7b-base
ibm-granite_granite-7b-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ibm-granite/granite-7b-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ibm-granite/granite-7b-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ibm-granite__granite-7b-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ibm-granite/granite-7b-instruct
c6d1adfa5cdba2c8344e055bb7de87b7935250a8
11.808373
apache-2.0
4
6
true
false
false
true
0.711452
0.297231
29.723135
0.372295
12.639329
0.006798
0.679758
0.285235
4.697987
0.402
8.816667
0.22864
14.293366
false
true
2024-05-19
2024-10-02
1
ibm/granite-7b-base
icefog72_Ice0.15-02.10-RP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/icefog72/Ice0.15-02.10-RP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">icefog72/Ice0.15-02.10-RP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/icefog72__Ice0.15-02.10-RP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
icefog72/Ice0.15-02.10-RP
ab67a8b63836ec7c8e6729d79d9dfd2708b20eb3
21.491327
cc-by-nc-4.0
7
7
true
false
false
false
0.592822
0.534336
53.433556
0.497638
30.130104
0.057402
5.740181
0.277685
3.691275
0.431979
12.997396
0.306599
22.955452
true
false
2024-10-02
2024-10-02
0
icefog72/Ice0.15-02.10-RP
icefog72_Ice0.16-02.10-RP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/icefog72/Ice0.16-02.10-RP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">icefog72/Ice0.16-02.10-RP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/icefog72__Ice0.16-02.10-RP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
icefog72/Ice0.16-02.10-RP
cb5c4d8a2e74efb41eae8b6dff8d06252c0a795d
21.051242
cc-by-nc-4.0
1
7
true
false
false
false
0.596756
0.506908
50.690834
0.494556
29.582321
0.057402
5.740181
0.279362
3.914989
0.433375
13.405208
0.306765
22.973921
true
false
2024-10-02
2024-10-02
0
icefog72/Ice0.16-02.10-RP
icefog72_Ice0.17-03.10-RP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/icefog72/Ice0.17-03.10-RP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">icefog72/Ice0.17-03.10-RP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/icefog72__Ice0.17-03.10-RP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
icefog72/Ice0.17-03.10-RP
ca5a429546334784d94bcab0eb52c5f22f433680
21.414404
cc-by-nc-4.0
1
7
true
false
false
false
0.610206
0.512354
51.235389
0.500682
30.376262
0.061178
6.117825
0.281879
4.250559
0.433375
13.338542
0.308511
23.167849
true
false
2024-10-03
2024-10-03
0
icefog72/Ice0.17-03.10-RP
icefog72_Ice0.27-06.11-RP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/icefog72/Ice0.27-06.11-RP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">icefog72/Ice0.27-06.11-RP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/icefog72__Ice0.27-06.11-RP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
icefog72/Ice0.27-06.11-RP
f2c78e71b59e0d36475217e3f265bc135f7c8505
21.831252
0
7
false
false
false
false
0.414351
0.491821
49.182059
0.511165
31.364752
0.056647
5.664653
0.312081
8.277405
0.432781
12.564323
0.315409
23.934323
false
false
2024-11-06
2024-11-06
1
icefog72/Ice0.27-06.11-RP (Merge)
icefog72_Ice0.29-06.11-RP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/icefog72/Ice0.29-06.11-RP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">icefog72/Ice0.29-06.11-RP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/icefog72__Ice0.29-06.11-RP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
icefog72/Ice0.29-06.11-RP
932f16ea3f790553904f0d2dfcdc861d737cbaf7
21.703644
0
7
false
false
false
false
0.429817
0.48605
48.605035
0.508788
31.359454
0.055891
5.589124
0.302852
7.04698
0.445896
14.370312
0.309259
23.25096
false
false
2024-11-06
2024-11-06
1
icefog72/Ice0.29-06.11-RP (Merge)
icefog72_Ice0.31-08.11-RP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/icefog72/Ice0.31-08.11-RP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">icefog72/Ice0.31-08.11-RP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/icefog72__Ice0.31-08.11-RP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
icefog72/Ice0.31-08.11-RP
52d947b170ee72c7f4c2b63b11f00330847e44f9
21.886899
cc-by-nc-4.0
1
7
true
false
false
false
0.463469
0.514577
51.457688
0.503213
30.460342
0.061178
6.117825
0.307886
7.718121
0.427667
11.891667
0.313082
23.675754
true
false
2024-11-08
2024-11-08
1
icefog72/Ice0.31-08.11-RP (Merge)
icefog72_Ice0.32-10.11-RP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/icefog72/Ice0.32-10.11-RP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">icefog72/Ice0.32-10.11-RP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/icefog72__Ice0.32-10.11-RP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
icefog72/Ice0.32-10.11-RP
a05dbb7fe0e756afb73c19e6f33c5481a9ac2ba8
21.634831
cc-by-nc-4.0
1
7
true
false
false
false
0.424032
0.491546
49.154577
0.50477
30.43094
0.05136
5.135952
0.312081
8.277405
0.438208
13.476042
0.310007
23.334072
true
false
2024-11-11
2024-11-11
1
icefog72/Ice0.32-10.11-RP (Merge)
icefog72_Ice0.34b-14.11-RP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/icefog72/Ice0.34b-14.11-RP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">icefog72/Ice0.34b-14.11-RP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/icefog72__Ice0.34b-14.11-RP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
icefog72/Ice0.34b-14.11-RP
5362f57fd0402c7c14c8dbe6b55c8b979cc8f475
21.681834
cc-by-nc-4.0
0
7
true
false
false
false
0.429823
0.476209
47.620868
0.50672
30.806357
0.064955
6.495468
0.309564
7.941834
0.44199
13.615365
0.3125
23.611111
true
false
2024-11-14
2024-11-14
1
icefog72/Ice0.34b-14.11-RP (Merge)
icefog72_Ice0.34n-14.11-RP_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/icefog72/Ice0.34n-14.11-RP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">icefog72/Ice0.34n-14.11-RP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/icefog72__Ice0.34n-14.11-RP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
icefog72/Ice0.34n-14.11-RP
1a39b99112926fc8dd44c3be35d99c04388d3078
21.828057
cc-by-nc-4.0
1
7
true
false
false
false
0.448548
0.478657
47.865663
0.509109
31.206253
0.069486
6.94864
0.313758
8.501119
0.437958
12.844792
0.312417
23.601876
true
false
2024-11-14
2024-11-14
1
icefog72/Ice0.34n-14.11-RP (Merge)
icefog72_Ice0.37-18.11-RP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/icefog72/Ice0.37-18.11-RP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">icefog72/Ice0.37-18.11-RP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/icefog72__Ice0.37-18.11-RP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
icefog72/Ice0.37-18.11-RP
4d9dfaa52efdaede3291c85ccb9c5966636298e0
21.913941
cc-by-nc-4.0
1
7
true
false
false
false
0.414513
0.497216
49.721628
0.508431
31.04285
0.064199
6.41994
0.312081
8.277405
0.433927
12.207552
0.314328
23.814273
true
false
2024-11-18
2024-11-18
1
icefog72/Ice0.37-18.11-RP (Merge)
icefog72_Ice0.38-19.11-RP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/icefog72/Ice0.38-19.11-RP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">icefog72/Ice0.38-19.11-RP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/icefog72__Ice0.38-19.11-RP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
icefog72/Ice0.38-19.11-RP
5d35120e4511369d97441c1732b3abf02bcc27ff
20.85153
0
7
false
false
false
false
0.422738
0.440338
44.03383
0.510108
31.330629
0.057402
5.740181
0.30453
7.270694
0.436719
12.95651
0.313996
23.777335
false
false
2024-11-19
0
Removed
icefog72_Ice0.39-19.11-RP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/icefog72/Ice0.39-19.11-RP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">icefog72/Ice0.39-19.11-RP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/icefog72__Ice0.39-19.11-RP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
icefog72/Ice0.39-19.11-RP
044d7404646a13187ecabc5f87480a4e6bcaf18c
21.326177
0
7
false
false
false
false
0.425293
0.475659
47.565903
0.509299
31.263627
0.049094
4.909366
0.310403
8.053691
0.434146
12.534896
0.312666
23.62958
false
false
2024-11-20
0
Removed
icefog72_Ice0.40-20.11-RP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/icefog72/Ice0.40-20.11-RP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">icefog72/Ice0.40-20.11-RP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/icefog72__Ice0.40-20.11-RP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
icefog72/Ice0.40-20.11-RP
4d8d429be08dc2e57be3e890797c8e861264aad5
21.76755
cc-by-nc-4.0
6
7
true
false
false
false
0.939995
0.476259
47.625855
0.509309
31.50524
0.062689
6.268882
0.307047
7.606264
0.444594
14.274219
0.309924
23.324837
true
false
2024-11-21
2024-11-21
1
icefog72/Ice0.40-20.11-RP (Merge)
icefog72_Ice0.41-22.11-RP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/icefog72/Ice0.41-22.11-RP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">icefog72/Ice0.41-22.11-RP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/icefog72__Ice0.41-22.11-RP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
icefog72/Ice0.41-22.11-RP
d785cbcf4c7c25cf2c8ce1ad941c79810fc3ec59
18.972234
cc-by-nc-4.0
1
7
true
false
false
false
0.42731
0.462045
46.204515
0.472332
25.412777
0.02719
2.719033
0.286913
4.9217
0.455979
16.597396
0.261802
17.977985
true
false
2024-11-22
2024-11-22
1
icefog72/Ice0.41-22.11-RP (Merge)
icefog72_Ice0.7-29.09-RP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/icefog72/Ice0.7-29.09-RP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">icefog72/Ice0.7-29.09-RP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/icefog72__Ice0.7-29.09-RP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
icefog72/Ice0.7-29.09-RP
932f2687137eebcafa9b90fe06e73ed272e0be81
21.587811
1
7
false
false
false
false
0.586854
0.517574
51.757448
0.504766
30.725876
0.068731
6.873112
0.287752
5.033557
0.423792
11.507292
0.312666
23.62958
false
false
2024-09-29
2024-10-03
1
icefog72/Ice0.7-29.09-RP (Merge)
icefog72_IceCocoaRP-7b_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/icefog72/IceCocoaRP-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">icefog72/IceCocoaRP-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/icefog72__IceCocoaRP-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
icefog72/IceCocoaRP-7b
001beaf88932f7e010af21bbdeff0079bda73b1d
20.959619
cc-by-nc-4.0
3
7
true
false
false
false
0.584942
0.496242
49.624219
0.49379
29.636895
0.059668
5.966767
0.295302
6.040268
0.419792
11.173958
0.30984
23.315603
true
false
2024-06-07
2024-06-26
1
icefog72/IceCocoaRP-7b (Merge)
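
Note: the "Average ⬆️" column in these records can be reproduced as the arithmetic mean of the six normalized benchmark columns (IFEval, BBH, MATH Lvl 5, GPQA, MUSR, MMLU-PRO). Below is a minimal Python sketch of that check; it assumes a record has already been parsed into a dict keyed by those column names (the `recompute_average` helper and the `record` variable are illustrative, not part of the dump), using the icefog72/IceCocoaRP-7b row above as the worked example.

```python
# Minimal sketch: recompute the "Average ⬆️" column from the six
# normalized benchmark scores. Assumes each record has been parsed
# into a dict keyed by the leaderboard's column names; the sample
# values below are copied from the icefog72/IceCocoaRP-7b row.

BENCH_COLS = ["IFEval", "BBH", "MATH Lvl 5", "GPQA", "MUSR", "MMLU-PRO"]

def recompute_average(record: dict) -> float:
    """Arithmetic mean of the six normalized benchmark columns."""
    return sum(record[col] for col in BENCH_COLS) / len(BENCH_COLS)

record = {
    "IFEval": 49.624219,
    "BBH": 29.636895,
    "MATH Lvl 5": 5.966767,
    "GPQA": 6.040268,
    "MUSR": 11.173958,
    "MMLU-PRO": 23.315603,
}

# Reported Average ⬆️ for this row is 20.959619; the recomputed mean
# agrees to within rounding of the stored six decimal places.
assert abs(recompute_average(record) - 20.959619) < 1e-5
```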