Column schema of the leaderboard table (one row per leaderboard entry):

| Column | Dtype | Range / distinct values |
| --- | --- | --- |
| eval_name | string | lengths 12–111 |
| Precision | string | 3 values |
| Type | string | 7 values |
| T | string | 7 values |
| Weight type | string | 3 values |
| Architecture | string | 62 values |
| Model | string | lengths 355–689 |
| fullname | string | lengths 4–102 |
| Model sha | string | lengths 0–40 |
| Average ⬆️ | float64 | 0.74 – 52 |
| Hub License | string | 27 values |
| Hub ❤️ | int64 | 0 – 5.99k |
| #Params (B) | float64 | -1 – 141 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.03 – 107 |
| IFEval Raw | float64 | 0 – 0.9 |
| IFEval | float64 | 0 – 90 |
| BBH Raw | float64 | 0.24 – 0.75 |
| BBH | float64 | 0.25 – 64.1 |
| MATH Lvl 5 Raw | float64 | 0 – 0.52 |
| MATH Lvl 5 | float64 | 0 – 52.4 |
| GPQA Raw | float64 | 0.21 – 0.47 |
| GPQA | float64 | 0 – 29.4 |
| MUSR Raw | float64 | 0.29 – 0.6 |
| MUSR | float64 | 0 – 38.5 |
| MMLU-PRO Raw | float64 | 0.1 – 0.73 |
| MMLU-PRO | float64 | 0 – 70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | string | 480 values |
| Submission Date | string | 220 values |
| Generation | int64 | 0 – 10 |
| Base Model | string | lengths 4–102 |
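A minimal sketch of how this table can be loaded and inspected programmatically. The dataset id `open-llm-leaderboard/contents` is an assumption here, not something stated above; substitute the actual repository id if it differs.

```python
# Minimal sketch: load the leaderboard table and inspect its columns.
# Assumption: the rows below come from the "open-llm-leaderboard/contents"
# dataset on the Hugging Face Hub; adjust the id if the source differs.
from datasets import load_dataset

ds = load_dataset("open-llm-leaderboard/contents", split="train")
df = ds.to_pandas()

print(df.shape)                    # (number of rows, 36 columns)
print(df.dtypes)                   # matches the schema table above
print(df["Precision"].unique())    # e.g. bfloat16, float16, 4bit in this excerpt
```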
Enno-Ai_EnnoAi-Pro-Llama-3-8B-v0.3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[Enno-Ai/EnnoAi-Pro-Llama-3-8B-v0.3](https://huggingface.co/Enno-Ai/EnnoAi-Pro-Llama-3-8B-v0.3) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/Enno-Ai__EnnoAi-Pro-Llama-3-8B-v0.3-details)
Enno-Ai/EnnoAi-Pro-Llama-3-8B-v0.3
cf29b8b484a909132e3a1f85ce891d28347c0d13
17.524058
creativeml-openrail-m
0
8.03
true
false
false
true
1.470836
0.508257
50.825698
0.410058
16.668386
0.012085
1.208459
0.265101
2.013423
0.423573
12.313281
0.299036
22.1151
false
false
2024-06-26
2024-06-26
0
Enno-Ai/EnnoAi-Pro-Llama-3-8B-v0.3
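The Average ⬆️ column is the arithmetic mean of the six normalized benchmark scores (IFEval, BBH, MATH Lvl 5, GPQA, MUSR, MMLU-PRO). A small check using the numbers from the Enno-Ai/EnnoAi-Pro-Llama-3-8B-v0.3 row above:

```python
# The reported Average ⬆️ (17.524058) is the mean of the six normalized scores
# listed for Enno-Ai/EnnoAi-Pro-Llama-3-8B-v0.3 above.
scores = {
    "IFEval": 50.825698,
    "BBH": 16.668386,
    "MATH Lvl 5": 1.208459,
    "GPQA": 2.013423,
    "MUSR": 12.313281,
    "MMLU-PRO": 22.1151,
}
average = sum(scores.values()) / len(scores)
print(round(average, 6))  # 17.524058
```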
Enno-Ai_EnnoAi-Pro-Llama-3.1-8B-v0.9_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[Enno-Ai/EnnoAi-Pro-Llama-3.1-8B-v0.9](https://huggingface.co/Enno-Ai/EnnoAi-Pro-Llama-3.1-8B-v0.9) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/Enno-Ai__EnnoAi-Pro-Llama-3.1-8B-v0.9-details)
Enno-Ai/EnnoAi-Pro-Llama-3.1-8B-v0.9
c740871122fd471a1a225cf2b4368e333752d74c
14.945694
apache-2.0
0
8.03
true
false
false
true
0.932571
0.468915
46.89147
0.416027
17.498296
0
0
0.26594
2.12528
0.383177
5.430469
0.259558
17.72865
false
false
2024-08-22
2024-09-06
0
Enno-Ai/EnnoAi-Pro-Llama-3.1-8B-v0.9
EnnoAi_EnnoAi-Pro-Llama-3.1-8B-v1.0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[EnnoAi/EnnoAi-Pro-Llama-3.1-8B-v1.0](https://huggingface.co/EnnoAi/EnnoAi-Pro-Llama-3.1-8B-v1.0) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EnnoAi__EnnoAi-Pro-Llama-3.1-8B-v1.0-details)
EnnoAi/EnnoAi-Pro-Llama-3.1-8B-v1.0
c740871122fd471a1a225cf2b4368e333752d74c
14.97109
apache-2.0
0
8.03
true
false
false
true
0.945642
0.470438
47.043844
0.416027
17.498296
0
0
0.26594
2.12528
0.383177
5.430469
0.259558
17.72865
false
false
2024-08-22
2024-09-06
0
EnnoAi/EnnoAi-Pro-Llama-3.1-8B-v1.0
Epiculous_Azure_Dusk-v0.2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
[Epiculous/Azure_Dusk-v0.2](https://huggingface.co/Epiculous/Azure_Dusk-v0.2) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/Epiculous__Azure_Dusk-v0.2-details)
Epiculous/Azure_Dusk-v0.2
ebddf1b2efbe7f9cae066d263b0991ded89c88e8
14.050827
apache-2.0
7
12.248
true
false
false
true
1.991411
0.346716
34.67156
0.411972
17.396414
0.018127
1.812689
0.260906
1.454139
0.383458
6.365625
0.303441
22.604536
false
false
2024-09-09
2024-09-14
0
Epiculous/Azure_Dusk-v0.2
Epiculous_Crimson_Dawn-v0.2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
[Epiculous/Crimson_Dawn-v0.2](https://huggingface.co/Epiculous/Crimson_Dawn-v0.2) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/Epiculous__Crimson_Dawn-v0.2-details)
Epiculous/Crimson_Dawn-v0.2
4cceb1e25026afef241ad5325097e88eccd8f37a
14.884541
apache-2.0
12
12.248
true
false
false
true
3.492384
0.310345
31.034544
0.448238
21.688249
0.030967
3.096677
0.276007
3.467562
0.415177
10.897135
0.272108
19.123079
false
false
2024-09-02
2024-09-05
0
Epiculous/Crimson_Dawn-v0.2
Epiculous_NovaSpark_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[Epiculous/NovaSpark](https://huggingface.co/Epiculous/NovaSpark) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/Epiculous__NovaSpark-details)
Epiculous/NovaSpark
a46340895859e470c3e69661f0b894677cf4c5cb
25.228562
apache-2.0
7
8.03
true
false
false
true
0.818185
0.640847
64.08474
0.506396
29.526911
0.150302
15.030211
0.297819
6.375839
0.388198
6.92474
0.36486
29.42893
false
false
2024-10-13
2024-10-20
1
Epiculous/NovaSpark (Merge)
Epiculous_Violet_Twilight-v0.2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
[Epiculous/Violet_Twilight-v0.2](https://huggingface.co/Epiculous/Violet_Twilight-v0.2) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/Epiculous__Violet_Twilight-v0.2-details)
Epiculous/Violet_Twilight-v0.2
30c8bad3c1f565150afbf2fc90cacf4f45d096f6
18.552773
apache-2.0
22
12.248
true
false
false
true
1.770436
0.453178
45.317757
0.461455
23.940537
0.028701
2.870091
0.26594
2.12528
0.429938
13.608854
0.311087
23.454122
true
false
2024-09-12
2024-09-16
0
Epiculous/Violet_Twilight-v0.2
EpistemeAI_Alpaca-Llama3.1-8B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[EpistemeAI/Alpaca-Llama3.1-8B](https://huggingface.co/EpistemeAI/Alpaca-Llama3.1-8B) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Alpaca-Llama3.1-8B-details)
EpistemeAI/Alpaca-Llama3.1-8B
3152dfa17322dff7c6af6dbf3daceaf5db51e230
13.922106
apache-2.0
0
8
true
false
false
false
0.920853
0.159869
15.986915
0.475526
25.935227
0.046828
4.682779
0.290268
5.369128
0.34026
6.599219
0.324634
24.959368
false
false
2024-09-11
2024-08-13
2
meta-llama/Meta-Llama-3.1-8B
EpistemeAI_Athena-gemma-2-2b-it_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
[EpistemeAI/Athena-gemma-2-2b-it](https://huggingface.co/EpistemeAI/Athena-gemma-2-2b-it) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Athena-gemma-2-2b-it-details)
EpistemeAI/Athena-gemma-2-2b-it
661c1dc6a1a096222e33416e099bd02b7b970405
14.294329
apache-2.0
2
2
true
false
false
false
2.036798
0.313417
31.341729
0.426423
19.417818
0.033988
3.398792
0.268456
2.46085
0.435052
13.348177
0.242188
15.798611
false
false
2024-08-29
2024-09-06
4
google/gemma-2-9b
EpistemeAI_Athena-gemma-2-2b-it-Philos_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
[EpistemeAI/Athena-gemma-2-2b-it-Philos](https://huggingface.co/EpistemeAI/Athena-gemma-2-2b-it-Philos) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Athena-gemma-2-2b-it-Philos-details)
EpistemeAI/Athena-gemma-2-2b-it-Philos
dea2b35d496bd32ed3c88d42ff3022654153f2e1
15.122657
apache-2.0
0
2
true
false
false
true
1.128593
0.462095
46.209502
0.379478
13.212088
0.004532
0.453172
0.28104
4.138702
0.431365
12.853906
0.224817
13.868573
false
false
2024-09-05
2024-09-05
1
unsloth/gemma-2-2b-it-bnb-4bit
EpistemeAI_Athene-codegemma-2-7b-it-alpaca-v1.3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GemmaForCausalLM
[EpistemeAI/Athene-codegemma-2-7b-it-alpaca-v1.3](https://huggingface.co/EpistemeAI/Athene-codegemma-2-7b-it-alpaca-v1.3) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Athene-codegemma-2-7b-it-alpaca-v1.3-details)
EpistemeAI/Athene-codegemma-2-7b-it-alpaca-v1.3
9c26e1242a11178b53937bc0e9a744ef6141e05a
17.314022
apache-2.0
0
7
true
false
false
false
0.971978
0.402994
40.299406
0.433192
20.873795
0.061934
6.193353
0.280201
4.026846
0.450302
14.854427
0.258727
17.636303
false
false
2024-09-06
2024-09-06
2
Removed
EpistemeAI_FineLlama3.1-8B-Instruct_4bit
4bit
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
[EpistemeAI/FineLlama3.1-8B-Instruct](https://huggingface.co/EpistemeAI/FineLlama3.1-8B-Instruct) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__FineLlama3.1-8B-Instruct-details)
EpistemeAI/FineLlama3.1-8B-Instruct
a8b0fc584b10e0110e04f9d21c7f10d24391c1d5
11.100787
0
14.483
false
false
false
false
2.354961
0.08001
8.000993
0.455736
23.506619
0.026435
2.643505
0.280201
4.026846
0.348167
4.954167
0.311253
23.472592
false
false
2024-08-10
0
Removed
EpistemeAI_Fireball-12B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
[EpistemeAI/Fireball-12B](https://huggingface.co/EpistemeAI/Fireball-12B) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-12B-details)
EpistemeAI/Fireball-12B
e2ed12c3244f2502321fb20e76dfc72ad7817d6e
15.509355
apache-2.0
1
12.248
true
false
false
false
1.618521
0.18335
18.335018
0.511089
30.666712
0.039275
3.927492
0.261745
1.565996
0.423635
12.521094
0.334358
26.03982
false
false
2024-08-20
2024-08-21
2
Removed
EpistemeAI_Fireball-12B-v1.13a-philosophers_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
[EpistemeAI/Fireball-12B-v1.13a-philosophers](https://huggingface.co/EpistemeAI/Fireball-12B-v1.13a-philosophers) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-12B-v1.13a-philosophers-details)
EpistemeAI/Fireball-12B-v1.13a-philosophers
7fa824d4a40abca3f8c75d432ea151dc0d1d67d6
14.440865
apache-2.0
2
12
true
false
false
false
1.662663
0.087553
8.755325
0.51027
30.336233
0.044562
4.456193
0.301174
6.823266
0.408073
9.975781
0.336686
26.298389
false
false
2024-08-28
2024-09-03
1
Removed
EpistemeAI_Fireball-Alpaca-Llama-3.1-8B-Philos-DPO-200_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
[EpistemeAI/Fireball-Alpaca-Llama-3.1-8B-Philos-DPO-200](https://huggingface.co/EpistemeAI/Fireball-Alpaca-Llama-3.1-8B-Philos-DPO-200) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Alpaca-Llama-3.1-8B-Philos-DPO-200-details)
EpistemeAI/Fireball-Alpaca-Llama-3.1-8B-Philos-DPO-200
27d67626304954db71f21fec9e7fc516421274ec
21.066974
apache-2.0
0
8
true
false
false
false
0.922381
0.457724
45.772439
0.48384
26.377774
0.119335
11.933535
0.300336
6.711409
0.394458
6.907292
0.358295
28.699394
false
false
2024-09-16
2024-09-16
4
meta-llama/Meta-Llama-3.1-8B
EpistemeAI_Fireball-Alpaca-Llama3.1.07-8B-Philos-Math-KTO-beta_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
[EpistemeAI/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math-KTO-beta](https://huggingface.co/EpistemeAI/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math-KTO-beta) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Alpaca-Llama3.1.07-8B-Philos-Math-KTO-beta-details)
EpistemeAI/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math-KTO-beta
2851384717556dd6ac14c00ed87aac1f267eb263
25.179287
apache-2.0
0
8
true
false
false
true
0.885645
0.727401
72.740107
0.486489
26.897964
0.148792
14.879154
0.280201
4.026846
0.361938
4.275521
0.354305
28.256132
false
false
2024-09-12
2024-09-14
5
meta-llama/Meta-Llama-3.1-8B
EpistemeAI_Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[EpistemeAI/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R2](https://huggingface.co/EpistemeAI/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R2) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R2-details)
EpistemeAI/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R2
b19336101aa5f4807d1574f4c11eebc1c1a1c34e
22.537889
apache-2.0
0
8
true
false
false
false
0.811743
0.467316
46.731561
0.493203
28.247009
0.123112
12.311178
0.286074
4.809843
0.462365
16.995573
0.335189
26.132166
false
false
2024-09-14
2024-09-14
3
meta-llama/Meta-Llama-3.1-8B
EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-0.001-128K-auto_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-0.001-128K-auto](https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-0.001-128K-auto) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-0.001-128K-auto-details)
EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-0.001-128K-auto
19b23c434b6c4524e2146926cdbf4f0e927ae3ab
21.57995
apache-2.0
0
8
true
false
false
false
0.694994
0.443186
44.31863
0.482364
26.832967
0.133686
13.36858
0.312081
8.277405
0.406646
8.730729
0.351563
27.951389
false
false
2024-11-14
2024-11-15
2
meta-llama/Meta-Llama-3.1-8B
EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K](https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-details)
EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K
b4a88fb5fb27fc5d8a503303cdb7aaeff373fd92
20.627168
apache-2.0
3
8
true
false
false
false
0.814786
0.445734
44.573399
0.489732
28.025161
0.120846
12.084592
0.294463
5.928412
0.376229
4.895312
0.354305
28.256132
false
false
2024-09-26
2024-10-05
1
Removed
EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code](https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-details)
EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code
8e8f1569a8a01ed3d6588f2669c730d4993355b5
23.89695
apache-2.0
2
8
true
false
false
false
0.854318
0.597533
59.753343
0.490419
28.171888
0.13142
13.141994
0.302013
6.935123
0.401031
8.46224
0.342254
26.91711
false
false
2024-10-04
2024-10-05
2
Removed
EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds](https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-details)
EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds
8b73dd02349f0544c48c581cc73ada5cac6ff946
22.993108
llama3.1
2
8
true
false
false
true
1.716734
0.669099
66.90991
0.466807
24.462654
0.124622
12.462236
0.272651
3.020134
0.341781
4.55599
0.33893
26.547725
false
false
2024-10-14
2024-10-15
4
Removed
EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto](https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-details)
EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto
f18598c62a783bcc0d436a35df0c8a335e8ee5d7
23.749941
apache-2.0
8
8.03
true
false
false
true
2.285306
0.730498
73.049841
0.464925
24.586737
0.139728
13.97281
0.26594
2.12528
0.320885
1.210677
0.347989
27.5543
false
false
2024-10-21
2024-10-29
1
EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto (Merge)
EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto](https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-details)
EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto
055e87600d18e58594a8d193f45c0ee9a90e1780
23.488818
apache-2.0
8
8.03
true
false
false
true
0.672068
0.720707
72.070661
0.461009
23.544253
0.123112
12.311178
0.270134
2.684564
0.34324
4.171615
0.335356
26.150635
false
false
2024-10-21
2024-11-27
1
EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto (Merge)
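The same checkpoint can appear under more than one Precision, as with the float16 and bfloat16 entries for EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto above. A hedged pandas sketch for lining such duplicates up side by side, assuming the table has been loaded into a DataFrame `df` as in the earlier sketch:

```python
# Sketch: compare entries that share a fullname but were evaluated at
# different precisions. Assumes `df` is the DataFrame from the loading
# sketch near the top of this table.
dupes = df[df.duplicated("fullname", keep=False)]
by_precision = dupes.pivot_table(
    index="fullname",
    columns="Precision",
    values="Average ⬆️",
    aggfunc="first",
)
print(by_precision.dropna(thresh=2))  # fullnames evaluated in 2+ precisions
```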
EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-COT_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-COT](https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-COT) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-COT-details)
EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-COT
bb90c19dc7c4a509e7bd73f4620dca818b58be25
20.832251
apache-2.0
0
8
true
false
false
false
0.839037
0.457824
45.782413
0.476052
25.820865
0.136707
13.670695
0.293624
5.816555
0.388135
6.45026
0.347074
27.452719
false
false
2024-10-11
2024-10-11
3
Removed
EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-ds-auto_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-ds-auto](https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-ds-auto) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-ds-auto-details)
EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-ds-auto
db5ddb161ed26bc16baa814e31892dbe2f22b7a0
23.760965
apache-2.0
1
8
true
false
false
true
0.745131
0.720482
72.048166
0.48178
26.45206
0.136707
13.670695
0.248322
0
0.33
2.083333
0.354804
28.31154
false
false
2024-11-14
2024-11-14
1
EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-ds-auto (Merge)
EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-Math_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Math](https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Math) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-Math-details)
EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Math
677c97b4f92bfc330d4fae628e9a1df1ef606dcc
20.545341
apache-2.0
0
8.03
true
false
false
false
0.910272
0.462296
46.22956
0.498295
28.959344
0.107251
10.725076
0.291107
5.480984
0.364073
5.975781
0.333112
25.9013
false
false
2024-09-23
2024-09-23
3
meta-llama/Meta-Llama-3.1-8B
EpistemeAI_Fireball-Meta-Llama-3.2-8B-Instruct-agent-003-128k-code-DPO_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
[EpistemeAI/Fireball-Meta-Llama-3.2-8B-Instruct-agent-003-128k-code-DPO](https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.2-8B-Instruct-agent-003-128k-code-DPO) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.2-8B-Instruct-agent-003-128k-code-DPO-details)
EpistemeAI/Fireball-Meta-Llama-3.2-8B-Instruct-agent-003-128k-code-DPO
b3c0fce7daa359cd8ed5be6595dd1a76ca2cfea2
21.205445
apache-2.0
1
8
true
false
false
false
0.833576
0.461097
46.109656
0.480101
26.317878
0.120091
12.009063
0.300336
6.711409
0.399823
8.077865
0.352061
28.006797
false
false
2024-10-08
2024-10-09
3
Removed
EpistemeAI_Fireball-Mistral-Nemo-Base-2407-v1-DPO2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
[EpistemeAI/Fireball-Mistral-Nemo-Base-2407-v1-DPO2](https://huggingface.co/EpistemeAI/Fireball-Mistral-Nemo-Base-2407-v1-DPO2) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Mistral-Nemo-Base-2407-v1-DPO2-details)
EpistemeAI/Fireball-Mistral-Nemo-Base-2407-v1-DPO2
2cf732fbffefdf37341b946edd7995f14d3f9487
15.2764
apache-2.0
0
12.248
true
false
false
false
1.771269
0.186073
18.607295
0.496777
28.567825
0.032477
3.247734
0.291946
5.592841
0.40401
9.501302
0.335273
26.141401
false
false
2024-08-19
2024-08-19
1
Removed
EpistemeAI_Llama-3.2-3B-Agent007-Coder_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[EpistemeAI/Llama-3.2-3B-Agent007-Coder](https://huggingface.co/EpistemeAI/Llama-3.2-3B-Agent007-Coder) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Llama-3.2-3B-Agent007-Coder-details)
EpistemeAI/Llama-3.2-3B-Agent007-Coder
7ff4e77796b6d308e96d0150e1a01081c0b82e01
18.901974
apache-2.0
0
3
true
false
false
false
0.710816
0.539956
53.995621
0.430376
19.025809
0.110272
11.02719
0.25755
1.006711
0.366802
7.783594
0.285156
20.572917
false
false
2024-10-08
2024-10-08
2
meta-llama/Llama-3.2-3B-Instruct
EpistemeAI_Mistral-Nemo-Instruct-12B-Philosophy-Math_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
[EpistemeAI/Mistral-Nemo-Instruct-12B-Philosophy-Math](https://huggingface.co/EpistemeAI/Mistral-Nemo-Instruct-12B-Philosophy-Math) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Mistral-Nemo-Instruct-12B-Philosophy-Math-details)
EpistemeAI/Mistral-Nemo-Instruct-12B-Philosophy-Math
1ac4205f8da109326b4a5cf173e5491a20087d76
16.566232
apache-2.0
0
12.248
true
false
false
false
1.363607
0.069468
6.94679
0.536493
33.835811
0.093656
9.365559
0.331376
10.850112
0.429219
12.885677
0.329621
25.513446
false
false
2024-09-15
2024-09-26
1
unsloth/Mistral-Nemo-Instruct-2407-bnb-4bit
EpistemeAI_Polypsyche-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-Empathy_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[EpistemeAI/Polypsyche-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-Empathy](https://huggingface.co/EpistemeAI/Polypsyche-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-Empathy) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Polypsyche-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-Empathy-details)
EpistemeAI/Polypsyche-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-Empathy
daabf0dcd2915991531abac59da346f27864c7e7
23.170589
apache-2.0
0
8
true
false
false
true
0.667831
0.71009
71.009034
0.462799
24.419414
0.130665
13.066465
0.276846
3.579418
0.31949
1.269531
0.331117
25.679669
false
false
2024-12-13
2024-12-13
2
EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto (Merge)
EpistemeAI_Polypsyche-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-Logic_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[EpistemeAI/Polypsyche-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-Logic](https://huggingface.co/EpistemeAI/Polypsyche-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-Logic) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Polypsyche-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-Logic-details)
EpistemeAI/Polypsyche-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-Logic
20a0141e08db10f1d0ffb771676e56c7d2045acf
23.104196
apache-2.0
1
8.03
true
false
false
true
0.684498
0.712214
71.221359
0.456594
23.576451
0.11858
11.858006
0.284396
4.58613
0.32349
1.269531
0.335023
26.113697
false
false
2024-12-13
2024-12-20
2
EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto (Merge)
EpistemeAI_Polypsyche-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-divergent_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[EpistemeAI/Polypsyche-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-divergent](https://huggingface.co/EpistemeAI/Polypsyche-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-divergent) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Polypsyche-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-divergent-details)
EpistemeAI/Polypsyche-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-divergent
3cba0f0085c1f95f011cbf76d35a2303c54b2141
22.586672
apache-2.0
1
8.03
true
false
false
true
0.697942
0.691531
69.153069
0.452473
22.890368
0.102719
10.271903
0.266779
2.237136
0.35775
5.51875
0.329039
25.448803
false
false
2024-12-13
2024-12-20
2
EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto (Merge)
EpistemeAI_Reasoning-Llama-3.1-CoT-RE1-NMT_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[EpistemeAI/Reasoning-Llama-3.1-CoT-RE1-NMT](https://huggingface.co/EpistemeAI/Reasoning-Llama-3.1-CoT-RE1-NMT) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Reasoning-Llama-3.1-CoT-RE1-NMT-details)
EpistemeAI/Reasoning-Llama-3.1-CoT-RE1-NMT
3ae39e39a02ff222a7436499462261b22ca28367
18.389948
apache-2.0
0
8.03
true
false
false
true
0.72392
0.482853
48.285327
0.473576
25.544054
0.080816
8.081571
0.260906
1.454139
0.318219
0.94401
0.334275
26.030585
false
false
2025-01-29
2025-01-29
0
EpistemeAI/Reasoning-Llama-3.1-CoT-RE1-NMT
EpistemeAI2_Athene-codegemma-2-7b-it-alpaca-v1.2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GemmaForCausalLM
[EpistemeAI2/Athene-codegemma-2-7b-it-alpaca-v1.2](https://huggingface.co/EpistemeAI2/Athene-codegemma-2-7b-it-alpaca-v1.2) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Athene-codegemma-2-7b-it-alpaca-v1.2-details)
EpistemeAI2/Athene-codegemma-2-7b-it-alpaca-v1.2
21b31062334a316b50680e8c3a141a72e4c30b61
15.693215
apache-2.0
0
7
true
false
false
false
0.969635
0.435118
43.511771
0.417542
18.97137
0.040785
4.07855
0.270973
2.796421
0.416969
10.38776
0.229721
14.413416
false
false
2024-08-26
2024-08-26
2
Removed
EpistemeAI2_Fireball-12B-v1.2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
[EpistemeAI2/Fireball-12B-v1.2](https://huggingface.co/EpistemeAI2/Fireball-12B-v1.2) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-12B-v1.2-details)
EpistemeAI2/Fireball-12B-v1.2
57af42edf8232189ee99e9a21e33a0c306e3f561
15.162522
apache-2.0
1
12
true
false
false
false
1.872565
0.135539
13.553926
0.501858
29.776014
0.039275
3.927492
0.298658
6.487696
0.417313
11.264062
0.333693
25.965943
false
false
2024-08-27
2024-08-28
1
Removed
EpistemeAI2_Fireball-Alpaca-Llama3.1-8B-Philos_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[EpistemeAI2/Fireball-Alpaca-Llama3.1-8B-Philos](https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1-8B-Philos) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1-8B-Philos-details)
EpistemeAI2/Fireball-Alpaca-Llama3.1-8B-Philos
3dcca4cf9bdd9003c8dc91f5c78cefef1d4ae0d7
22.539085
apache-2.0
1
8
true
false
false
false
0.848332
0.49864
49.864027
0.497758
29.259226
0.117825
11.782477
0.292785
5.704698
0.427667
11.891667
0.340592
26.732417
false
false
2024-08-29
2024-08-29
3
meta-llama/Meta-Llama-3.1-8B
EpistemeAI2_Fireball-Alpaca-Llama3.1.01-8B-Philos_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[EpistemeAI2/Fireball-Alpaca-Llama3.1.01-8B-Philos](https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.01-8B-Philos) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.01-8B-Philos-details)
EpistemeAI2/Fireball-Alpaca-Llama3.1.01-8B-Philos
f97293ed5cec7fb9482b16600259967c6c923e4b
21.567144
apache-2.0
0
8
true
false
false
false
0.870572
0.421179
42.117914
0.495611
28.628475
0.135952
13.595166
0.288591
5.145414
0.437062
13.432813
0.338348
26.483082
false
false
2024-09-03
2024-09-03
3
meta-llama/Meta-Llama-3.1-8B
EpistemeAI2_Fireball-Alpaca-Llama3.1.03-8B-Philos_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[EpistemeAI2/Fireball-Alpaca-Llama3.1.03-8B-Philos](https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.03-8B-Philos) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.03-8B-Philos-details)
EpistemeAI2/Fireball-Alpaca-Llama3.1.03-8B-Philos
6e60f783f80f7d126b8e4f2b417e14dea63d2c4f
20.29975
apache-2.0
0
8
true
false
false
false
0.797523
0.388081
38.80814
0.495087
27.992549
0.129909
12.990937
0.278523
3.803132
0.42801
12.034635
0.335522
26.169105
false
false
2024-09-04
2024-09-04
3
meta-llama/Meta-Llama-3.1-8B
EpistemeAI2_Fireball-Alpaca-Llama3.1.04-8B-Philos_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[EpistemeAI2/Fireball-Alpaca-Llama3.1.04-8B-Philos](https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.04-8B-Philos) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.04-8B-Philos-details)
EpistemeAI2/Fireball-Alpaca-Llama3.1.04-8B-Philos
efd0c251373e1a2fa2bc8cead502c03ff6dc7c8b
21.031577
apache-2.0
0
8
true
false
false
false
0.765248
0.40844
40.843961
0.493001
27.963798
0.116314
11.63142
0.290268
5.369128
0.437219
13.685677
0.340259
26.695479
false
false
2024-09-05
2024-09-05
3
meta-llama/Meta-Llama-3.1-8B
EpistemeAI2_Fireball-Alpaca-Llama3.1.06-8B-Philos-dpo_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
[EpistemeAI2/Fireball-Alpaca-Llama3.1.06-8B-Philos-dpo](https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.06-8B-Philos-dpo) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.06-8B-Philos-dpo-details)
EpistemeAI2/Fireball-Alpaca-Llama3.1.06-8B-Philos-dpo
3e76f190b505b515479cc25e92f8229c2b05159f
21.829867
apache-2.0
0
8
true
false
false
false
0.934774
0.486576
48.657562
0.488077
27.207177
0.128399
12.839879
0.297819
6.375839
0.393188
6.848437
0.361453
29.05031
false
false
2024-09-09
2024-09-09
5
meta-llama/Meta-Llama-3.1-8B
EpistemeAI2_Fireball-Alpaca-Llama3.1.07-8B-Philos-Math_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[EpistemeAI2/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math](https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.07-8B-Philos-Math-details)
EpistemeAI2/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math
0b2842bddfa6c308f67eb5a20daf04536a4e6d1a
21.870165
apache-2.0
0
8
true
false
false
false
0.90203
0.507908
50.790791
0.484702
26.901201
0.114048
11.404834
0.296141
6.152125
0.406302
7.854427
0.353059
28.117612
false
false
2024-09-10
2024-09-10
4
meta-llama/Meta-Llama-3.1-8B
EpistemeAI2_Fireball-Alpaca-Llama3.1.08-8B-C-R1-KTO-Reflection_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
[EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-C-R1-KTO-Reflection](https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-C-R1-KTO-Reflection) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.08-8B-C-R1-KTO-Reflection-details)
EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-C-R1-KTO-Reflection
dc900138b4406353b7e84251bc8649d70c16f13f
20.882037
apache-2.0
0
8
true
false
false
false
0.883974
0.395226
39.522578
0.495531
27.571611
0.123867
12.386707
0.299497
6.599553
0.404813
10.401563
0.359292
28.81021
false
false
2024-09-16
2024-09-16
6
meta-llama/Meta-Llama-3.1-8B
EpistemeAI2_Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R1](https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R1) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R1-details)
EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R1
c57c786426123635baf6c8b4d30638d2053f4565
22.410483
apache-2.0
0
8
true
false
false
false
0.909759
0.531638
53.163828
0.482793
26.763685
0.117825
11.782477
0.29698
6.263982
0.410302
8.454427
0.352311
28.034501
false
false
2024-09-13
2024-09-13
4
meta-llama/Meta-Llama-3.1-8B
EpistemeAI2_Fireball-Llama-3.1-8B-Philos-Reflection_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[EpistemeAI2/Fireball-Llama-3.1-8B-Philos-Reflection](https://huggingface.co/EpistemeAI2/Fireball-Llama-3.1-8B-Philos-Reflection) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Llama-3.1-8B-Philos-Reflection-details)
EpistemeAI2/Fireball-Llama-3.1-8B-Philos-Reflection
4b0b75d9235886e8a947c45b94f87c5a65a81467
20.389309
apache-2.0
0
8
true
false
false
false
0.894943
0.359605
35.960474
0.489769
27.769796
0.129154
12.915408
0.307886
7.718121
0.395729
9.632813
0.355053
28.339243
false
false
2024-09-17
2024-09-17
5
meta-llama/Meta-Llama-3.1-8B
EpistemeAI2_Fireball-MathMistral-Nemo-Base-2407-v2dpo_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
[EpistemeAI2/Fireball-MathMistral-Nemo-Base-2407-v2dpo](https://huggingface.co/EpistemeAI2/Fireball-MathMistral-Nemo-Base-2407-v2dpo) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-MathMistral-Nemo-Base-2407-v2dpo-details)
EpistemeAI2/Fireball-MathMistral-Nemo-Base-2407-v2dpo
6b7d851c66359f39d16da6fbcf810b816dc6e4bc
11.332218
apache-2.0
1
11.58
true
false
false
true
1.881426
0.30972
30.972043
0.432764
21.145528
0.034743
3.47432
0.263423
1.789709
0.402958
8.969792
0.114777
1.641918
false
false
2024-08-21
2024-08-24
2
unsloth/Mistral-Nemo-Base-2407-bnb-4bit
EpistemeAI2_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-math_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[EpistemeAI2/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-math](https://huggingface.co/EpistemeAI2/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-math) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-math-details)
EpistemeAI2/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-math
aa21037cf0984cb293facb69c41895e7fccb1340
22.677605
apache-2.0
0
8
true
false
false
false
0.791683
0.551547
55.154656
0.480756
26.743767
0.132175
13.217523
0.30453
7.270694
0.36925
6.789583
0.342005
26.889406
false
false
2024-10-11
2024-10-12
3
Removed
EpistemeAI2_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.005-128K-code-COT_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[EpistemeAI2/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.005-128K-code-COT](https://huggingface.co/EpistemeAI2/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.005-128K-code-COT) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.005-128K-code-COT-details)
EpistemeAI2/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.005-128K-code-COT
cf8b99d4aa00c18fdaebfb24fa3c674ee6defa1a
20.999994
apache-2.0
0
8
true
false
false
false
0.800818
0.46332
46.331955
0.479083
26.400992
0.114804
11.480363
0.312081
8.277405
0.377438
5.013021
0.356466
28.496232
false
false
2024-10-11
2024-10-11
3
Removed
EpistemeAI2_Fireball-Phi-3-medium-4k-inst-Philos_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
[EpistemeAI2/Fireball-Phi-3-medium-4k-inst-Philos](https://huggingface.co/EpistemeAI2/Fireball-Phi-3-medium-4k-inst-Philos) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Phi-3-medium-4k-inst-Philos-details)
EpistemeAI2/Fireball-Phi-3-medium-4k-inst-Philos
147715051102034fac98091e2a0cae6cade15ae0
29.172842
apache-2.0
0
13.96
true
false
false
true
0.771814
0.531288
53.128809
0.617784
46.208873
0.140483
14.048338
0.332215
10.961969
0.413906
10.704948
0.459857
39.984116
false
false
2024-09-19
2024-09-20
1
unsloth/phi-3-medium-4k-instruct-bnb-4bit
Eric111_CatunaMayo_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
[Eric111/CatunaMayo](https://huggingface.co/Eric111/CatunaMayo) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/Eric111__CatunaMayo-details)
Eric111/CatunaMayo
23337893381293975cbcc35f75b634954fbcefaf
21.299155
apache-2.0
0
7.242
true
false
false
false
0.550825
0.407416
40.741566
0.524364
33.299426
0.086103
8.610272
0.291946
5.592841
0.45399
15.348698
0.317819
24.202128
true
false
2024-02-15
2024-07-03
0
Eric111/CatunaMayo
Eric111_CatunaMayo-DPO_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
[Eric111/CatunaMayo-DPO](https://huggingface.co/Eric111/CatunaMayo-DPO) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/Eric111__CatunaMayo-DPO-details)
Eric111/CatunaMayo-DPO
6bdbe06c10d57d152dd8a79a71edd8e30135b689
21.255121
apache-2.0
1
7.242
true
false
false
false
0.554023
0.421454
42.145396
0.522399
33.089952
0.079305
7.930514
0.291946
5.592841
0.445031
14.66224
0.316988
24.109781
true
false
2024-02-21
2024-06-27
0
Eric111/CatunaMayo-DPO
Etherll_Chocolatine-3B-Instruct-DPO-Revised-Ties_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Phi3ForCausalLM
[Etherll/Chocolatine-3B-Instruct-DPO-Revised-Ties](https://huggingface.co/Etherll/Chocolatine-3B-Instruct-DPO-Revised-Ties) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Chocolatine-3B-Instruct-DPO-Revised-Ties-details)
Etherll/Chocolatine-3B-Instruct-DPO-Revised-Ties
8a9c3d745e0805e769b544622b3f5c039abc9b07
24.402767
0
3.821
false
false
false
false
0.635497
0.372469
37.246949
0.541065
35.583343
0.128399
12.839879
0.323826
9.8434
0.464938
17.817187
0.397773
33.085845
false
false
2024-10-28
0
Removed
Etherll_Chocolatine-3B-Instruct-DPO-Revised-Ties-v2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Phi3ForCausalLM
[Etherll/Chocolatine-3B-Instruct-DPO-Revised-Ties-v2](https://huggingface.co/Etherll/Chocolatine-3B-Instruct-DPO-Revised-Ties-v2) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Chocolatine-3B-Instruct-DPO-Revised-Ties-v2-details)
Etherll/Chocolatine-3B-Instruct-DPO-Revised-Ties-v2
121b0831361743558e1a56fd89ae3d3c03272cc4
24.428163
0
3.821
false
false
false
false
0.631296
0.373993
37.399323
0.541065
35.583343
0.128399
12.839879
0.323826
9.8434
0.464938
17.817187
0.397773
33.085845
false
false
2024-10-29
0
Removed
Etherll_Herplete-LLM-Llama-3.1-8b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[Etherll/Herplete-LLM-Llama-3.1-8b](https://huggingface.co/Etherll/Herplete-LLM-Llama-3.1-8b) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Herplete-LLM-Llama-3.1-8b-details)
Etherll/Herplete-LLM-Llama-3.1-8b
b3829cf437216f099c031a9ab5e4c8ec974766dd
19.588708
5
8.03
false
false
false
true
0.973685
0.467191
46.71915
0.501343
28.952591
0.027946
2.794562
0.286074
4.809843
0.386
6.683333
0.348155
27.572769
false
false
2024-08-24
2024-08-29
1
Etherll/Herplete-LLM-Llama-3.1-8b (Merge)
Etherll_Herplete-LLM-Llama-3.1-8b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[Etherll/Herplete-LLM-Llama-3.1-8b](https://huggingface.co/Etherll/Herplete-LLM-Llama-3.1-8b) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Herplete-LLM-Llama-3.1-8b-details)
Etherll/Herplete-LLM-Llama-3.1-8b
d1383d993fad005d515be4d815797019601c679f
26.260139
5
8.03
false
false
false
false
0.854807
0.610598
61.059766
0.534725
33.206608
0.154834
15.483384
0.314597
8.612975
0.399052
8.614844
0.375249
30.583259
false
false
2024-08-24
2024-10-18
1
Etherll/Herplete-LLM-Llama-3.1-8b (Merge)
Etherll_Herplete-LLM-Llama-3.1-8b-Ties_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[Etherll/Herplete-LLM-Llama-3.1-8b-Ties](https://huggingface.co/Etherll/Herplete-LLM-Llama-3.1-8b-Ties) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Herplete-LLM-Llama-3.1-8b-Ties-details)
Etherll/Herplete-LLM-Llama-3.1-8b-Ties
26.571056
0
8.03
false
false
false
false
0.862201
0.616368
61.63679
0.533798
33.07089
0.162387
16.238671
0.317114
8.948546
0.401719
8.948177
0.375249
30.583259
false
false
2024-10-03
2024-10-17
1
Etherll/Herplete-LLM-Llama-3.1-8b-Ties (Merge)
Etherll_Qwen2.5-7B-della-test_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[Etherll/Qwen2.5-7B-della-test](https://huggingface.co/Etherll/Qwen2.5-7B-della-test) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Qwen2.5-7B-della-test-details)
Etherll/Qwen2.5-7B-della-test
c2b2ffc38627e68e7b43a1b596dc16ee93c1c63b
27.659468
1
7.616
false
false
false
true
1.385742
0.762497
76.249684
0.544733
35.546894
0
0
0.308725
7.829978
0.404698
8.98724
0.436087
37.343011
false
false
2024-11-01
2024-11-14
1
Etherll/Qwen2.5-7B-della-test (Merge)
Etherll_Qwen2.5-Coder-7B-Instruct-Ties_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[Etherll/Qwen2.5-Coder-7B-Instruct-Ties](https://huggingface.co/Etherll/Qwen2.5-Coder-7B-Instruct-Ties) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Qwen2.5-Coder-7B-Instruct-Ties-details)
Etherll/Qwen2.5-Coder-7B-Instruct-Ties
d8c1624a2fa60f05030e34a128af391b5d8be332
24.474445
0
7.616
false
false
false
false
1.197181
0.500539
50.053857
0.489514
28.008294
0.169184
16.918429
0.329698
10.626398
0.437281
13.426823
0.350316
27.812869
false
false
2024-09-30
2024-10-28
1
Etherll/Qwen2.5-Coder-7B-Instruct-Ties (Merge)
Etherll_Replete-LLM-V3-Llama-3.1-8b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[Etherll/Replete-LLM-V3-Llama-3.1-8b](https://huggingface.co/Etherll/Replete-LLM-V3-Llama-3.1-8b) · [📑 details](https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Replete-LLM-V3-Llama-3.1-8b-details)
Etherll/Replete-LLM-V3-Llama-3.1-8b
e79849d72f70ef74677ed81a8885403973b2470c
17.927882
5
8.03
false
false
false
true
0.789329
0.526292
52.629246
0.454338
22.902455
0.000755
0.075529
0.268456
2.46085
0.351646
2.055729
0.346991
27.443484
false
false
2024-08-24
2024-08-26
1
Etherll/Replete-LLM-V3-Llama-3.1-8b (Merge)
Etherll_SuperHermes_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Etherll/SuperHermes" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Etherll/SuperHermes</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Etherll__SuperHermes-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Etherll/SuperHermes
7edd56cb37722d09b0334826e0532b223d334939
26.604602
1
8.03
false
false
false
false
0.750015
0.545902
54.590154
0.528953
32.840317
0.146526
14.652568
0.323826
9.8434
0.440042
14.938542
0.394864
32.762633
false
false
2024-10-27
2024-10-27
1
Etherll/SuperHermes (Merge)
Eurdem_Defne-llama3.1-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Eurdem/Defne-llama3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Eurdem/Defne-llama3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Eurdem__Defne-llama3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Eurdem/Defne-llama3.1-8B
7832ba3066636bf4dab3e7d658c0b3ded12491ae
25.095429
llama3.1
6
8.03
true
false
false
false
1.7203
0.503612
50.361153
0.532098
32.822381
0.15861
15.861027
0.296141
6.152125
0.433094
13.536719
0.386553
31.83917
false
false
2024-07-29
2024-08-14
0
Eurdem/Defne-llama3.1-8B
FINGU-AI_L3-8B_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FINGU-AI/L3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FINGU-AI/L3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FINGU-AI__L3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FINGU-AI/L3-8B
7e7999af68810a8158bf1cf939b1874d430d51f1
28.889358
llama3.1
2
8.03
true
false
false
true
0.712095
0.751731
75.173096
0.498559
28.805821
0.253021
25.302115
0.295302
6.040268
0.382833
8.6875
0.363946
29.327349
false
false
2025-01-18
2025-01-18
0
FINGU-AI/L3-8B
FINGU-AI_Q-Small-3B_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/FINGU-AI/Q-Small-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FINGU-AI/Q-Small-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FINGU-AI__Q-Small-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FINGU-AI/Q-Small-3B
42ad8458821a8574c3973d7e8088208a32c2fb81
16.663828
apache-2.0
0
3.086
true
false
false
true
0.727329
0.414535
41.453455
0.431853
21.386477
0.069486
6.94864
0.266779
2.237136
0.400542
8.067708
0.279006
19.889554
false
false
2025-01-21
2025-01-21
0
FINGU-AI/Q-Small-3B
FallenMerick_Chewy-Lemon-Cookie-11B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/FallenMerick/Chewy-Lemon-Cookie-11B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FallenMerick/Chewy-Lemon-Cookie-11B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FallenMerick__Chewy-Lemon-Cookie-11B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FallenMerick/Chewy-Lemon-Cookie-11B
0f5d0d6d218b3ef034f58eba32d6fe7ac4c237ae
22.018549
cc-by-4.0
0
10.732
true
false
false
false
0.857274
0.487524
48.752421
0.525112
33.0143
0.05287
5.287009
0.279362
3.914989
0.454552
15.952344
0.326712
25.190233
true
false
2024-06-06
2024-06-27
1
FallenMerick/Chewy-Lemon-Cookie-11B (Merge)
Felladrin_Llama-160M-Chat-v1_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Felladrin/Llama-160M-Chat-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Felladrin/Llama-160M-Chat-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Felladrin__Llama-160M-Chat-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Felladrin/Llama-160M-Chat-v1
e7f50665676821867ee7dfad32d0ca9fb68fc6bc
4.101061
apache-2.0
19
0.162
true
false
false
true
0.181581
0.157546
15.754642
0.303608
3.166756
0
0
0.25755
1.006711
0.366125
3.165625
0.113614
1.512633
false
false
2023-12-20
2024-07-23
1
JackFram/llama-160m
Felladrin_Minueza-32M-UltraChat_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Felladrin/Minueza-32M-UltraChat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Felladrin/Minueza-32M-UltraChat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Felladrin__Minueza-32M-UltraChat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Felladrin/Minueza-32M-UltraChat
28506b99c5902d2215eb378ec91d4226a7396c49
3.848727
apache-2.0
5
0.033
true
false
false
true
0.168067
0.137563
13.756278
0.294148
2.43729
0
0
0.255872
0.782998
0.374187
4.640104
0.113281
1.475694
false
false
2024-02-27
2024-07-23
1
Felladrin/Minueza-32M-Base
FlofloB_100k_fineweb_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit_float16
float16
🟩 continuously pretrained
🟩
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/100k_fineweb_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/100k_fineweb_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__100k_fineweb_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/100k_fineweb_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit
ea6ceae8a6894f1c6ea3fe978846b2a66c3e369c
7.871072
apache-2.0
1
0.5
true
false
false
true
0.483694
0.308322
30.832192
0.332339
7.347825
0
0
0.269295
2.572707
0.330219
0.94401
0.149767
5.529699
false
false
2024-11-28
2024-11-29
3
Qwen/Qwen2.5-0.5B
FlofloB_10k_continued_pretraining_Phi-3-mini-4k-instruct_Unsloth_merged_16bit_float16
float16
🟩 continuously pretrained
🟩
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/10k_continued_pretraining_Phi-3-mini-4k-instruct_Unsloth_merged_16bit" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/10k_continued_pretraining_Phi-3-mini-4k-instruct_Unsloth_merged_16bit</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__10k_continued_pretraining_Phi-3-mini-4k-instruct_Unsloth_merged_16bit-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/10k_continued_pretraining_Phi-3-mini-4k-instruct_Unsloth_merged_16bit
a2eb0460779e76bb511339bcc2545b4729c9d78e
23.879918
apache-2.0
1
16
true
false
false
true
0.487545
0.509731
50.973085
0.521499
32.6078
0.087613
8.761329
0.299497
6.599553
0.430958
13.569792
0.376912
30.767952
false
false
2024-11-22
2024-11-22
1
unsloth/phi-3-mini-4k-instruct-bnb-4bit
FlofloB_10k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit_float16
float16
🟩 continuously pretrained
🟩
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/10k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/10k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__10k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/10k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit
2152657b389375f48fc5073413bba17835117bcc
7.847811
apache-2.0
1
0.5
true
false
false
true
0.508365
0.281544
28.154408
0.330552
7.530229
0
0
0.279362
3.914989
0.330219
1.477344
0.154089
6.0099
false
false
2024-11-25
2024-11-25
3
Qwen/Qwen2.5-0.5B
FlofloB_40k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit_float16
float16
🟩 continuously pretrained
🟩
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/40k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/40k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__40k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/40k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit
64c61d9c777da56597a338afd7586cc4ad07d350
7.827703
apache-2.0
1
0.5
true
false
false
true
0.481567
0.301578
30.157759
0.332461
7.53209
0
0
0.267617
2.348993
0.340823
1.536198
0.148521
5.391179
false
false
2024-11-25
2024-11-25
3
Qwen/Qwen2.5-0.5B
FlofloB_83k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit_float16
float16
🟩 continuously pretrained
🟩
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/83k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/83k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__83k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/83k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit
4c4d3660d0288295f89880a3a86f4eb9ecc9d344
7.923936
apache-2.0
2
0.5
true
false
false
true
0.492186
0.28694
28.693976
0.334653
8.132273
0
0
0.27349
3.131991
0.328948
1.41849
0.155502
6.166888
false
false
2024-11-26
2024-11-26
3
Qwen/Qwen2.5-0.5B
FlofloB_smollm2-135M_pretrained_1000k_fineweb_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_1000k_fineweb" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_1000k_fineweb</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_1000k_fineweb-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2-135M_pretrained_1000k_fineweb
a0f91cfda4e5a820dbe30bd5e3fbb8f233f7467e
4.056751
apache-2.0
0
0.135
true
false
false
false
0.337909
0.148454
14.845388
0.291794
2.708744
0
0
0.262584
1.677852
0.358062
3.291146
0.116356
1.817376
false
false
2025-01-11
2025-01-14
5
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_1000k_fineweb_uncovai_human_removed_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_1000k_fineweb_uncovai_human_removed" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_1000k_fineweb_uncovai_human_removed</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_1000k_fineweb_uncovai_human_removed-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2-135M_pretrained_1000k_fineweb_uncovai_human_removed
73ba3da387b3bdc50d6e3594c5c89ddebb271e81
3.960645
apache-2.0
0
0.135
true
false
false
false
0.339105
0.155373
15.53733
0.306643
3.274267
0
0
0.250839
0.111857
0.358031
3.253906
0.114279
1.58651
false
false
2025-01-24
2025-01-27
5
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_1000k_fineweb_uncovai_selected_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_1000k_fineweb_uncovai_selected" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_1000k_fineweb_uncovai_selected</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_1000k_fineweb_uncovai_selected-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2-135M_pretrained_1000k_fineweb_uncovai_selected
e2115c3c7315400cb6338465672087c457b157ac
4.94255
apache-2.0
0
0.135
true
false
false
false
0.334808
0.146781
14.678054
0.293178
2.113414
0
0
0.26594
2.12528
0.40476
8.995052
0.115691
1.743499
false
false
2025-01-12
2025-01-12
5
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_1200k_fineweb_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_1200k_fineweb" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_1200k_fineweb</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_1200k_fineweb-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2-135M_pretrained_1200k_fineweb
d886605e0d45787f492f628fd0ea72c27f205f83
4.075019
apache-2.0
0
0.135
true
false
false
false
0.33538
0.158096
15.809607
0.294098
2.237296
0
0
0.264262
1.901566
0.371365
3.653906
0.10763
0.847739
false
false
2025-01-12
2025-01-12
6
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_1200k_fineweb_uncovai_human_removed_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_1200k_fineweb_uncovai_human_removed" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_1200k_fineweb_uncovai_human_removed</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_1200k_fineweb_uncovai_human_removed-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2-135M_pretrained_1200k_fineweb_uncovai_human_removed
d743033d6f0048af31089e1133de7cee8b1e83f5
4.267703
apache-2.0
0
0.135
true
false
false
false
0.336077
0.157771
15.777138
0.294962
2.849419
0
0
0.265101
2.013423
0.37
3.416667
0.113946
1.549572
false
false
2025-01-27
2025-01-27
6
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_1200k_fineweb_uncovai_selected_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_1200k_fineweb_uncovai_selected" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_1200k_fineweb_uncovai_selected</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_1200k_fineweb_uncovai_selected-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2-135M_pretrained_1200k_fineweb_uncovai_selected
8c05c5b2f00c84d4120b3221c81c1f481c585768
3.904624
apache-2.0
0
0.135
true
false
false
false
0.335295
0.158471
15.847064
0.296047
2.206545
0
0
0.263423
1.789709
0.356729
1.757812
0.116439
1.826611
false
false
2025-01-12
2025-01-14
6
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_1400k_fineweb_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_1400k_fineweb" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_1400k_fineweb</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_1400k_fineweb-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2-135M_pretrained_1400k_fineweb
a9c59a43cf0da87ad05ec8bd4a4c75d22c2e367c
4.804136
apache-2.0
0
0.135
true
false
false
false
0.344046
0.176381
17.638089
0.292178
2.1601
0
0
0.26594
2.12528
0.387333
6.016667
0.107962
0.884678
false
false
2025-01-13
2025-01-13
7
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_1400k_fineweb_uncovai_human_removed_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_1400k_fineweb_uncovai_human_removed" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_1400k_fineweb_uncovai_human_removed</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_1400k_fineweb_uncovai_human_removed-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2-135M_pretrained_1400k_fineweb_uncovai_human_removed
f2851eedb367100fa0ca50ed25ff610a83713de2
4.886798
apache-2.0
0
0.135
true
false
false
false
0.344123
0.170661
17.066051
0.299239
2.630029
0
0
0.260906
1.454139
0.393938
7.008854
0.110455
1.161717
false
false
2025-01-28
2025-01-28
7
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_1400k_fineweb_uncovai_selected_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_1400k_fineweb_uncovai_selected" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_1400k_fineweb_uncovai_selected</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_1400k_fineweb_uncovai_selected-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2-135M_pretrained_1400k_fineweb_uncovai_selected
098a8e666d272a8cb4863b0877b6f4507e1c230c
4.448406
apache-2.0
0
0.135
true
false
false
false
0.337709
0.15385
15.384956
0.291673
2.631616
0
0
0.268456
2.46085
0.374062
4.691146
0.113697
1.521868
false
false
2025-01-13
2025-01-13
7
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_200k_fineweb_uncovai_human_removed_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_200k_fineweb_uncovai_human_removed" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_200k_fineweb_uncovai_human_removed</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_200k_fineweb_uncovai_human_removed-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2-135M_pretrained_200k_fineweb_uncovai_human_removed
4bacfcaa1040d1cba93da123ce57749bf2ed5e82
3.819027
apache-2.0
0
0.135
true
false
false
false
0.333205
0.14748
14.74798
0.302874
2.82254
0
0
0.258389
1.118568
0.357844
2.897135
0.111951
1.32794
false
false
2025-01-17
2025-01-17
1
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_200k_fineweb_uncovai_selected_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_200k_fineweb_uncovai_selected" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_200k_fineweb_uncovai_selected</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_200k_fineweb_uncovai_selected-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2-135M_pretrained_200k_fineweb_uncovai_selected
381cdec29375aeaf0fb1bcc8ab2218443fc1cadd
3.366145
apache-2.0
1
0.135
true
false
false
false
0.341145
0.134515
13.451531
0.292719
2.322352
0
0
0.250839
0.111857
0.366031
2.853906
0.113115
1.457225
false
false
2025-01-08
2025-01-08
1
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_400k_fineweb_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_400k_fineweb" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_400k_fineweb</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_400k_fineweb-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2-135M_pretrained_400k_fineweb
2601cf93307104afc3f57f467323f5368567cb74
4.023535
apache-2.0
0
0.135
true
false
false
false
0.345628
0.151127
15.112679
0.297234
1.889766
0
0
0.252517
0.33557
0.379427
4.995052
0.116273
1.808141
false
false
2025-01-09
2025-01-10
2
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_400k_fineweb_uncovai_human_removed_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_400k_fineweb_uncovai_human_removed" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_400k_fineweb_uncovai_human_removed</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_400k_fineweb_uncovai_human_removed-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2-135M_pretrained_400k_fineweb_uncovai_human_removed
c99f5022db1982d463626b4d87c7aeeff519b3fa
4.559869
apache-2.0
0
0.135
true
false
false
false
0.339668
0.155648
15.564812
0.30488
3.575492
0
0
0.255034
0.671141
0.386
6.016667
0.11378
1.531102
false
false
2025-01-18
2025-01-18
2
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_400k_fineweb_uncovai_selected_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_400k_fineweb_uncovai_selected" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_400k_fineweb_uncovai_selected</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_400k_fineweb_uncovai_selected-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2-135M_pretrained_400k_fineweb_uncovai_selected
ecac44607d60c294b460a8786f6253d561f3de85
4.274038
apache-2.0
1
0.135
true
false
false
false
0.335765
0.158421
15.842077
0.292517
2.073466
0
0
0.254195
0.559284
0.382
5.416667
0.115775
1.752733
false
false
2025-01-09
2025-01-09
2
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_600k_fineweb_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_600k_fineweb" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_600k_fineweb</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_600k_fineweb-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2-135M_pretrained_600k_fineweb
6922498cf15ce9558b8ad2c33fc43106628d0cec
4.786034
apache-2.0
0
0.135
true
false
false
false
0.337077
0.163916
16.391619
0.301372
3.424053
0
0
0.26594
2.12528
0.380854
5.373437
0.112616
1.401817
false
false
2025-01-10
2025-01-11
3
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_600k_fineweb_uncovai_human_removed_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_600k_fineweb_uncovai_human_removed" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_600k_fineweb_uncovai_human_removed</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_600k_fineweb_uncovai_human_removed-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2-135M_pretrained_600k_fineweb_uncovai_human_removed
02a7c39af8a00dbd0ffa449cd830cf57261246b3
4.493344
apache-2.0
0
0.135
true
false
false
false
0.333867
0.164141
16.414115
0.300017
2.418749
0
0
0.262584
1.677852
0.379333
4.816667
0.114694
1.632683
false
false
2025-01-18
2025-01-19
3
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_600k_fineweb_uncovai_selected_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_600k_fineweb_uncovai_selected" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_600k_fineweb_uncovai_selected</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_600k_fineweb_uncovai_selected-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2-135M_pretrained_600k_fineweb_uncovai_selected
66e4931a5409bb8739522ff5df3b4f3373738fad
4.531725
apache-2.0
0
0.135
true
false
false
false
0.337788
0.160594
16.059389
0.298344
2.165156
0
0
0.260906
1.454139
0.384635
5.71276
0.11619
1.798907
false
false
2025-01-09
2025-01-09
3
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_800k_fineweb_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_800k_fineweb" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_800k_fineweb</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_800k_fineweb-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2-135M_pretrained_800k_fineweb
066f4d48c5f6d83ac9a44e8572a3d20c74f6ec08
4.036036
apache-2.0
0
0.135
true
false
false
false
0.336331
0.164141
16.414115
0.295944
2.348388
0
0
0.249161
0
0.370125
3.765625
0.115193
1.688091
false
false
2025-01-11
2025-01-14
4
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_800k_fineweb_uncovai_human_removed_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_800k_fineweb_uncovai_human_removed" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_800k_fineweb_uncovai_human_removed</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_800k_fineweb_uncovai_human_removed-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2-135M_pretrained_800k_fineweb_uncovai_human_removed
60c100113d77cced9b284172608f100297183ac9
4.91925
apache-2.0
0
0.135
true
false
false
false
0.334203
0.162293
16.229272
0.30381
3.210703
0
0
0.252517
0.33557
0.399271
8.208854
0.11378
1.531102
false
false
2025-01-19
2025-01-19
4
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_800k_fineweb_uncovai_selected_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_800k_fineweb_uncovai_selected" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_800k_fineweb_uncovai_selected</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_800k_fineweb_uncovai_selected-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2-135M_pretrained_800k_fineweb_uncovai_selected
7b351540b5fb395759e44385826c5fedef8672ec
4.043211
apache-2.0
0
0.135
true
false
false
false
0.335119
0.14743
14.742993
0.294281
1.922858
0
0
0.261745
1.565996
0.376635
4.579427
0.113032
1.447991
false
false
2025-01-11
2025-01-14
4
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2_pretrained_200k_fineweb_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2_pretrained_200k_fineweb" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2_pretrained_200k_fineweb</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2_pretrained_200k_fineweb-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2_pretrained_200k_fineweb
c3086ab3555e766f0b3903b8b9a1a290e3e25f3d
3.942659
apache-2.0
1
0.135
true
false
false
false
0.329732
0.1527
15.270039
0.299468
2.872523
0
0
0.247483
0
0.369938
3.742187
0.115941
1.771203
false
false
2025-01-08
2025-01-08
1
HuggingFaceTB/SmolLM2-135M
FlofloB_test_continued_pretraining_Phi-3-mini-4k-instruct_Unsloth_merged_16bit_float16
float16
🟩 continuously pretrained
🟩
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/test_continued_pretraining_Phi-3-mini-4k-instruct_Unsloth_merged_16bit" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/test_continued_pretraining_Phi-3-mini-4k-instruct_Unsloth_merged_16bit</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__test_continued_pretraining_Phi-3-mini-4k-instruct_Unsloth_merged_16bit-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/test_continued_pretraining_Phi-3-mini-4k-instruct_Unsloth_merged_16bit
cfd97ca5927a2e09ec30001a576d82dd8b635e09
24.460526
apache-2.0
1
16
true
false
false
true
1.008801
0.521546
52.154616
0.524083
32.882433
0.108761
10.876133
0.311242
8.165548
0.424417
12.452083
0.372091
30.232343
false
false
2024-11-21
2024-11-21
1
unsloth/phi-3-mini-4k-instruct-bnb-4bit
FuJhen_ft-openhermes-25-mistral-7b-irca-dpo-pairs_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
<a target="_blank" href="https://huggingface.co/FuJhen/ft-openhermes-25-mistral-7b-irca-dpo-pairs" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FuJhen/ft-openhermes-25-mistral-7b-irca-dpo-pairs</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FuJhen__ft-openhermes-25-mistral-7b-irca-dpo-pairs-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FuJhen/ft-openhermes-25-mistral-7b-irca-dpo-pairs
24c0bea14d53e6f67f1fbe2eca5bfe7cae389b33
19.615525
apache-2.0
0
14.483
true
false
false
true
1.002048
0.542004
54.20041
0.477303
26.596861
0.001511
0.151057
0.278523
3.803132
0.417375
11.205208
0.295628
21.73648
false
false
2024-09-12
2024-09-12
1
FuJhen/ft-openhermes-25-mistral-7b-irca-dpo-pairs (Merge)
FuJhen_mistral-instruct-7B-DPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
<a target="_blank" href="https://huggingface.co/FuJhen/mistral-instruct-7B-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FuJhen/mistral-instruct-7B-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FuJhen__mistral-instruct-7B-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FuJhen/mistral-instruct-7B-DPO
e0bc86c23ce5aae1db576c8cca6f06f1f73af2db
19.016943
apache-2.0
0
14.496
true
false
false
true
1.009647
0.496842
49.684171
0.462391
24.925827
0.037764
3.776435
0.277685
3.691275
0.401563
9.428646
0.303358
22.595301
false
false
2024-09-12
2024-09-12
1
FuJhen/mistral-instruct-7B-DPO (Merge)
FuJhen_mistral_7b_v0.1_structedData_e2e_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/FuJhen/mistral_7b_v0.1_structedData_e2e" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FuJhen/mistral_7b_v0.1_structedData_e2e</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FuJhen__mistral_7b_v0.1_structedData_e2e-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FuJhen/mistral_7b_v0.1_structedData_e2e
7231864981174d9bee8c7687c24c8344414eae6b
10.871547
apache-2.0
0
7
true
false
false
false
1.080246
0.172684
17.268403
0.411391
18.062424
0.002266
0.226586
0.279362
3.914989
0.372292
5.636458
0.281084
20.12042
false
false
2024-09-13
2024-09-13
1
FuJhen/mistral_7b_v0.1_structedData_e2e (Merge)
FuJhen_mistral_7b_v0.1_structedData_viggo_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/FuJhen/mistral_7b_v0.1_structedData_viggo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FuJhen/mistral_7b_v0.1_structedData_viggo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FuJhen__mistral_7b_v0.1_structedData_viggo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FuJhen/mistral_7b_v0.1_structedData_viggo
7231864981174d9bee8c7687c24c8344414eae6b
12.352466
apache-2.0
0
14.483
true
false
false
false
1.076114
0.178329
17.832906
0.452386
23.960172
0.023414
2.34139
0.283557
4.474273
0.373813
3.926563
0.294215
21.579492
false
false
2024-09-13
2024-09-13
1
FuJhen/mistral_7b_v0.1_structedData_viggo (Merge)
FuseAI_FuseChat-7B-v2.0_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/FuseAI/FuseChat-7B-v2.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FuseAI/FuseChat-7B-v2.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FuseAI__FuseChat-7B-v2.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FuseAI/FuseChat-7B-v2.0
65fdb310c09f56b9aca01b89a849f06f39faeb75
20.184132
apache-2.0
9
7.242
true
false
false
false
0.443306
0.342319
34.231949
0.495421
29.341638
0.063444
6.344411
0.302013
6.935123
0.479667
20.225
0.31624
24.02667
false
false
2024-08-13
2024-11-21
1
openchat/openchat_3.5
FuseAI_FuseChat-Llama-3.1-8B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FuseAI/FuseChat-Llama-3.1-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FuseAI/FuseChat-Llama-3.1-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FuseAI__FuseChat-Llama-3.1-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FuseAI/FuseChat-Llama-3.1-8B-Instruct
cbb3accdd01a81194e947dfde1b95707db67f2b7
25.637302
9
8.03
false
false
false
true
0.670215
0.720482
72.048166
0.511989
30.848065
0.070242
7.024169
0.305369
7.38255
0.382
6.15
0.373338
30.370863
false
false
2024-11-20
2025-01-07
0
FuseAI/FuseChat-Llama-3.1-8B-Instruct
FuseAI_FuseChat-Qwen-2.5-7B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/FuseAI/FuseChat-Qwen-2.5-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FuseAI/FuseChat-Qwen-2.5-7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FuseAI__FuseChat-Qwen-2.5-7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FuseAI/FuseChat-Qwen-2.5-7B-Instruct
7735ee1acb31112cf93c35e8e22e764ad27cce3b
23.804493
9
7.616
true
false
false
true
0.654559
0.590564
59.056415
0.5526
36.251348
0
0
0.296141
6.152125
0.387365
6.720573
0.411818
34.646498
false
false
2024-11-12
2024-12-21
0
FuseAI/FuseChat-Qwen-2.5-7B-Instruct
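The records above are evaluation entries from the open-llm-leaderboard organization, and each 📑 link points at a per-model "details" dataset on the Hugging Face Hub. Below is a minimal sketch of how one might pull such a details repository programmatically; it assumes the `datasets` library is installed, reuses one repo id copied verbatim from the links above, and makes no claim about which configurations or splits a given details repository actually exposes.

```python
# Minimal sketch, assuming the Hugging Face `datasets` library is available.
# The repo id is taken verbatim from one of the 📑 links above; the config and
# split layout of these "details" repositories is an assumption and may vary.
from datasets import get_dataset_config_names, load_dataset

repo_id = "open-llm-leaderboard/Etherll__Herplete-LLM-Llama-3.1-8b-Ties-details"

# A details repository typically bundles several configurations
# (assumption: roughly one per benchmark run); list them before loading.
configs = get_dataset_config_names(repo_id)
print(configs)

# Load the first configuration and inspect the available splits and one row.
ds = load_dataset(repo_id, configs[0])
print(ds)
print(next(iter(ds.values()))[0])
```

The same pattern applies to any of the other details repositories linked above: substitute the repo id from the corresponding 📑 link and inspect its configurations before assuming a particular split name.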