eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
FlofloB_83k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit_float16 | float16 | 🟩 continuously pretrained | 🟩 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/FlofloB/83k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/83k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__83k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FlofloB/83k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit | 4c4d3660d0288295f89880a3a86f4eb9ecc9d344 | 7.923936 | apache-2.0 | 1 | 0 | true | false | false | true | 0.492186 | 0.28694 | 28.693976 | 0.334653 | 8.132273 | 0 | 0 | 0.27349 | 3.131991 | 0.328948 | 1.41849 | 0.155502 | 6.166888 | false | false | 2024-11-26 | 2024-11-26 | 3 | Qwen/Qwen2.5-0.5B |
FlofloB_test_continued_pretraining_Phi-3-mini-4k-instruct_Unsloth_merged_16bit_float16 | float16 | 🟩 continuously pretrained | 🟩 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/FlofloB/test_continued_pretraining_Phi-3-mini-4k-instruct_Unsloth_merged_16bit" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/test_continued_pretraining_Phi-3-mini-4k-instruct_Unsloth_merged_16bit</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__test_continued_pretraining_Phi-3-mini-4k-instruct_Unsloth_merged_16bit-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FlofloB/test_continued_pretraining_Phi-3-mini-4k-instruct_Unsloth_merged_16bit | cfd97ca5927a2e09ec30001a576d82dd8b635e09 | 24.460526 | apache-2.0 | 1 | 16 | true | false | false | true | 1.008801 | 0.521546 | 52.154616 | 0.524083 | 32.882433 | 0.108761 | 10.876133 | 0.311242 | 8.165548 | 0.424417 | 12.452083 | 0.372091 | 30.232343 | false | false | 2024-11-21 | 2024-11-21 | 1 | unsloth/phi-3-mini-4k-instruct-bnb-4bit |
FuJhen_ft-openhermes-25-mistral-7b-irca-dpo-pairs_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Adapter | ? | <a target="_blank" href="https://huggingface.co/FuJhen/ft-openhermes-25-mistral-7b-irca-dpo-pairs" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FuJhen/ft-openhermes-25-mistral-7b-irca-dpo-pairs</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FuJhen__ft-openhermes-25-mistral-7b-irca-dpo-pairs-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FuJhen/ft-openhermes-25-mistral-7b-irca-dpo-pairs | 24c0bea14d53e6f67f1fbe2eca5bfe7cae389b33 | 19.615525 | apache-2.0 | 0 | 14 | true | false | false | true | 1.002048 | 0.542004 | 54.20041 | 0.477303 | 26.596861 | 0.001511 | 0.151057 | 0.278523 | 3.803132 | 0.417375 | 11.205208 | 0.295628 | 21.73648 | false | false | 2024-09-12 | 2024-09-12 | 1 | FuJhen/ft-openhermes-25-mistral-7b-irca-dpo-pairs (Merge) |
FuJhen_mistral-instruct-7B-DPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Adapter | ? | <a target="_blank" href="https://huggingface.co/FuJhen/mistral-instruct-7B-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FuJhen/mistral-instruct-7B-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FuJhen__mistral-instruct-7B-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FuJhen/mistral-instruct-7B-DPO | e0bc86c23ce5aae1db576c8cca6f06f1f73af2db | 19.016943 | apache-2.0 | 0 | 14 | true | false | false | true | 1.009647 | 0.496842 | 49.684171 | 0.462391 | 24.925827 | 0.037764 | 3.776435 | 0.277685 | 3.691275 | 0.401563 | 9.428646 | 0.303358 | 22.595301 | false | false | 2024-09-12 | 2024-09-12 | 1 | FuJhen/mistral-instruct-7B-DPO (Merge) |
FuJhen_mistral_7b_v0.1_structedData_e2e_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | <a target="_blank" href="https://huggingface.co/FuJhen/mistral_7b_v0.1_structedData_e2e" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FuJhen/mistral_7b_v0.1_structedData_e2e</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FuJhen__mistral_7b_v0.1_structedData_e2e-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FuJhen/mistral_7b_v0.1_structedData_e2e | 7231864981174d9bee8c7687c24c8344414eae6b | 10.871547 | apache-2.0 | 0 | 7 | true | false | false | false | 1.080246 | 0.172684 | 17.268403 | 0.411391 | 18.062424 | 0.002266 | 0.226586 | 0.279362 | 3.914989 | 0.372292 | 5.636458 | 0.281084 | 20.12042 | false | false | 2024-09-13 | 2024-09-13 | 1 | FuJhen/mistral_7b_v0.1_structedData_e2e (Merge) |
FuJhen_mistral_7b_v0.1_structedData_viggo_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | <a target="_blank" href="https://huggingface.co/FuJhen/mistral_7b_v0.1_structedData_viggo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FuJhen/mistral_7b_v0.1_structedData_viggo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FuJhen__mistral_7b_v0.1_structedData_viggo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FuJhen/mistral_7b_v0.1_structedData_viggo | 7231864981174d9bee8c7687c24c8344414eae6b | 12.352466 | apache-2.0 | 0 | 14 | true | false | false | false | 1.076114 | 0.178329 | 17.832906 | 0.452386 | 23.960172 | 0.023414 | 2.34139 | 0.283557 | 4.474273 | 0.373813 | 3.926563 | 0.294215 | 21.579492 | false | false | 2024-09-13 | 2024-09-13 | 1 | FuJhen/mistral_7b_v0.1_structedData_viggo (Merge) |
FuseAI_FuseChat-7B-v2.0_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/FuseAI/FuseChat-7B-v2.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FuseAI/FuseChat-7B-v2.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FuseAI__FuseChat-7B-v2.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FuseAI/FuseChat-7B-v2.0 | 65fdb310c09f56b9aca01b89a849f06f39faeb75 | 20.184132 | apache-2.0 | 9 | 7 | true | false | false | false | 0.443306 | 0.342319 | 34.231949 | 0.495421 | 29.341638 | 0.063444 | 6.344411 | 0.302013 | 6.935123 | 0.479667 | 20.225 | 0.31624 | 24.02667 | false | false | 2024-08-13 | 2024-11-21 | 1 | openchat/openchat_3.5 |
GalrionSoftworks_MN-LooseCannon-12B-v1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/GalrionSoftworks/MN-LooseCannon-12B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">GalrionSoftworks/MN-LooseCannon-12B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/GalrionSoftworks__MN-LooseCannon-12B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | GalrionSoftworks/MN-LooseCannon-12B-v1 | | 21.885253 | | 8 | 12 | false | false | false | true | 1.52902 | 0.541779 | 54.177915 | 0.512818 | 29.976062 | 0.070997 | 7.099698 | 0.285235 | 4.697987 | 0.413844 | 10.963802 | 0.319564 | 24.396055 | false | false | 2024-08-09 | 2024-09-05 | 1 | GalrionSoftworks/MN-LooseCannon-12B-v1 (Merge) |
GalrionSoftworks_MagnusIntellectus-12B-v1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/GalrionSoftworks/MagnusIntellectus-12B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">GalrionSoftworks/MagnusIntellectus-12B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/GalrionSoftworks__MagnusIntellectus-12B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | GalrionSoftworks/MagnusIntellectus-12B-v1 | fc83cb3eec2f8328448c5fe3cb830fc77983a6b9 | 21.622238 | apache-2.0 | 5 | 12 | true | false | false | true | 1.624264 | 0.442137 | 44.213686 | 0.532301 | 33.262254 | 0.055891 | 5.589124 | 0.284396 | 4.58613 | 0.442802 | 15.183594 | 0.342088 | 26.898641 | true | false | 2024-08-13 | 2024-09-05 | 1 | GalrionSoftworks/MagnusIntellectus-12B-v1 (Merge) |
GoToCompany_gemma2-9b-cpt-sahabatai-v1-instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/GoToCompany/gemma2-9b-cpt-sahabatai-v1-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">GoToCompany/gemma2-9b-cpt-sahabatai-v1-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/GoToCompany__gemma2-9b-cpt-sahabatai-v1-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | GoToCompany/gemma2-9b-cpt-sahabatai-v1-instruct | ca19cec82a7d2bdba20020e1bebf296417cfc3ee | 32.342379 | gemma | 26 | 9 | true | false | false | false | 1.931095 | 0.655061 | 65.506079 | 0.595455 | 41.866504 | 0.197885 | 19.78852 | 0.334732 | 11.297539 | 0.477865 | 19.333073 | 0.426363 | 36.262559 | false | false | 2024-11-06 | 2024-11-20 | 1 | GoToCompany/gemma2-9b-cpt-sahabatai-v1-instruct (Merge) |
GoToCompany_llama3-8b-cpt-sahabatai-v1-instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/GoToCompany/llama3-8b-cpt-sahabatai-v1-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">GoToCompany/llama3-8b-cpt-sahabatai-v1-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/GoToCompany__llama3-8b-cpt-sahabatai-v1-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | GoToCompany/llama3-8b-cpt-sahabatai-v1-instruct | 20fd3cff1dc86553d11b5c4b2fdbb6f2dd1ede55 | 22.908342 | llama3 | 6 | 8 | true | false | false | true | 0.673411 | 0.523845 | 52.384451 | 0.495129 | 28.539529 | 0.11858 | 11.858006 | 0.266779 | 2.237136 | 0.448844 | 15.172135 | 0.345329 | 27.258791 | false | false | 2024-11-06 | 2024-11-20 | 1 | GoToCompany/llama3-8b-cpt-sahabatai-v1-instruct (Merge) |
Goekdeniz-Guelmez_Josiefied-Qwen2.5-0.5B-Instruct-abliterated-v1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Goekdeniz-Guelmez/Josiefied-Qwen2.5-0.5B-Instruct-abliterated-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Goekdeniz-Guelmez/Josiefied-Qwen2.5-0.5B-Instruct-abliterated-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Goekdeniz-Guelmez__Josiefied-Qwen2.5-0.5B-Instruct-abliterated-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Goekdeniz-Guelmez/Josiefied-Qwen2.5-0.5B-Instruct-abliterated-v1 | bfc0e7dc6add02baecd9b6f84a078f7f3d164315 | 8.320603 | apache-2.0 | 1 | 0 | true | false | false | true | 0.487642 | 0.34719 | 34.71899 | 0.326831 | 6.845786 | 0.002266 | 0.226586 | 0.251678 | 0.223714 | 0.32625 | 0.78125 | 0.164146 | 7.12729 | false | false | 2024-11-17 | 2024-11-18 | 2 | Qwen/Qwen2.5-0.5B |
Goekdeniz-Guelmez_Josiefied-Qwen2.5-0.5B-Instruct-abliterated-v1_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Goekdeniz-Guelmez/Josiefied-Qwen2.5-0.5B-Instruct-abliterated-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Goekdeniz-Guelmez/Josiefied-Qwen2.5-0.5B-Instruct-abliterated-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Goekdeniz-Guelmez__Josiefied-Qwen2.5-0.5B-Instruct-abliterated-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Goekdeniz-Guelmez/Josiefied-Qwen2.5-0.5B-Instruct-abliterated-v1 | bfc0e7dc6add02baecd9b6f84a078f7f3d164315 | 8.415919 | apache-2.0 | 1 | 0 | true | false | false | true | 0.498004 | 0.341694 | 34.169448 | 0.32921 | 7.221169 | 0.002266 | 0.226586 | 0.25755 | 1.006711 | 0.324917 | 0.78125 | 0.163813 | 7.090352 | false | false | 2024-11-17 | 2024-11-18 | 2 | Qwen/Qwen2.5-0.5B |
Goekdeniz-Guelmez_Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v1_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Goekdeniz-Guelmez/Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Goekdeniz-Guelmez/Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Goekdeniz-Guelmez__Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Goekdeniz-Guelmez/Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v1 | eca7edeba61e894597e9940348e8d90817c1ad79 | 15.294146 | apache-2.0 | 4 | 1 | true | false | false | true | 0.783381 | 0.476858 | 47.685807 | 0.418601 | 18.306013 | 0.019637 | 1.963746 | 0.243289 | 0 | 0.36749 | 4.002865 | 0.278258 | 19.806442 | false | false | 2024-09-20 | 2024-09-28 | 1 | Qwen/Qwen2.5-1.5B |
Goekdeniz-Guelmez_Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Goekdeniz-Guelmez/Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Goekdeniz-Guelmez/Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Goekdeniz-Guelmez__Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Goekdeniz-Guelmez/Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v2 | ff4a6eff69adb015dfcfbff7a2d2dc43b34afe89 | 13.665944 | apache-2.0 | 1 | 1 | true | false | false | true | 0.719243 | 0.421554 | 42.15537 | 0.404189 | 16.499503 | 0.01284 | 1.283988 | 0.239933 | 0 | 0.376854 | 4.706771 | 0.25615 | 17.35003 | false | false | 2024-09-28 | 2024-09-28 | 2 | Qwen/Qwen2.5-1.5B |
Goekdeniz-Guelmez_Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v3_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Goekdeniz-Guelmez/Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Goekdeniz-Guelmez/Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Goekdeniz-Guelmez__Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Goekdeniz-Guelmez/Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v3 | 03ffa6f7a6ada9d63d838707c597297f048d409b | 13.540924 | apache-2.0 | 1 | 1 | true | false | false | true | 0.706201 | 0.425251 | 42.525056 | 0.405345 | 16.439712 | 0.007553 | 0.755287 | 0.243289 | 0 | 0.370187 | 4.240104 | 0.255568 | 17.285387 | false | false | 2024-09-28 | 2024-09-28 | 3 | Qwen/Qwen2.5-1.5B |
Goekdeniz-Guelmez_Josiefied-Qwen2.5-14B-Instruct-abliterated-v4_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Goekdeniz-Guelmez/Josiefied-Qwen2.5-14B-Instruct-abliterated-v4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Goekdeniz-Guelmez/Josiefied-Qwen2.5-14B-Instruct-abliterated-v4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Goekdeniz-Guelmez__Josiefied-Qwen2.5-14B-Instruct-abliterated-v4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Goekdeniz-Guelmez/Josiefied-Qwen2.5-14B-Instruct-abliterated-v4 | 00afd27eef16e835fcb0d8e687435dca3c185bdf | 33.511798 | apache-2.0 | 12 | 14 | true | false | false | true | 1.747117 | 0.829167 | 82.916661 | 0.635564 | 48.05227 | 0 | 0 | 0.342282 | 12.304251 | 0.428667 | 13.15 | 0.501828 | 44.647606 | false | false | 2024-10-21 | 2024-10-23 | 2 | Qwen/Qwen2.5-14B |
Goekdeniz-Guelmez_Josiefied-Qwen2.5-7B-Instruct-abliterated-v2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Goekdeniz-Guelmez/Josiefied-Qwen2.5-7B-Instruct-abliterated-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Goekdeniz-Guelmez/Josiefied-Qwen2.5-7B-Instruct-abliterated-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Goekdeniz-Guelmez__Josiefied-Qwen2.5-7B-Instruct-abliterated-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Goekdeniz-Guelmez/Josiefied-Qwen2.5-7B-Instruct-abliterated-v2 | ecf4024048ea1be2f0840a50080fb79b88aacde9 | 27.763763 | apache-2.0 | 4 | 7 | true | false | false | true | 1.201506 | 0.781381 | 78.138118 | 0.530967 | 33.333986 | 0 | 0 | 0.298658 | 6.487696 | 0.435396 | 13.957813 | 0.411985 | 34.664967 | false | false | 2024-09-20 | 2024-10-08 | 1 | Qwen/Qwen2.5-7B |
Goekdeniz-Guelmez_j.o.s.i.e.v4o-1.5b-dpo-stage1-v1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Goekdeniz-Guelmez/j.o.s.i.e.v4o-1.5b-dpo-stage1-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Goekdeniz-Guelmez/j.o.s.i.e.v4o-1.5b-dpo-stage1-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Goekdeniz-Guelmez__j.o.s.i.e.v4o-1.5b-dpo-stage1-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Goekdeniz-Guelmez/j.o.s.i.e.v4o-1.5b-dpo-stage1-v1 | d5ddad290d83b1ba8a7612a6c1cfad6fb4346fe4 | 13.567474 | apache-2.0 | 1 | 1 | true | false | false | true | 0.791153 | 0.418831 | 41.883092 | 0.412421 | 17.748017 | 0.029456 | 2.945619 | 0.250839 | 0.111857 | 0.352854 | 1.440104 | 0.255485 | 17.276152 | false | false | 2024-10-07 | 2024-10-08 | 2 | Qwen/Qwen2.5-1.5B |
GreenNode_GreenNode-small-9B-it_float16 | float16 | 🟩 continuously pretrained | 🟩 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/GreenNode/GreenNode-small-9B-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">GreenNode/GreenNode-small-9B-it</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/GreenNode__GreenNode-small-9B-it-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | GreenNode/GreenNode-small-9B-it | 1ba4ce8e2267c7fcc820961a9bfc13ab80150866 | 28.286651 | | 0 | 9 | false | false | false | true | 2.645944 | 0.743613 | 74.36125 | 0.599384 | 41.899926 | 0 | 0 | 0.319631 | 9.284116 | 0.420417 | 11.652083 | 0.392703 | 32.522533 | false | false | 2024-10-14 | | 0 | Removed |
GritLM_GritLM-7B-KTO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/GritLM/GritLM-7B-KTO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">GritLM/GritLM-7B-KTO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/GritLM__GritLM-7B-KTO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | GritLM/GritLM-7B-KTO | b5c48669508c1de18c698460c187f64e90e7df44 | 19.172954 | apache-2.0 | 4 | 7 | true | false | false | true | 0.639864 | 0.531013 | 53.101327 | 0.485294 | 27.904318 | 0.023414 | 2.34139 | 0.297819 | 6.375839 | 0.371021 | 6.644271 | 0.268035 | 18.670582 | false | false | 2024-04-16 | 2024-08-04 | 0 | GritLM/GritLM-7B-KTO |
GritLM_GritLM-8x7B-KTO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/GritLM/GritLM-8x7B-KTO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">GritLM/GritLM-8x7B-KTO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/GritLM__GritLM-8x7B-KTO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | GritLM/GritLM-8x7B-KTO | 938913477064fcc498757c5136d9899bb6e713ed | 25.838485 | apache-2.0 | 3 | 46 | true | false | false | true | 4.604463 | 0.571405 | 57.140498 | 0.58203 | 40.826162 | 0.098187 | 9.818731 | 0.296141 | 6.152125 | 0.421656 | 11.673698 | 0.364777 | 29.419696 | false | false | 2024-04-17 | 2024-08-04 | 0 | GritLM/GritLM-8x7B-KTO |
Gryphe_Pantheon-RP-1.0-8b-Llama-3_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Gryphe/Pantheon-RP-1.0-8b-Llama-3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Gryphe/Pantheon-RP-1.0-8b-Llama-3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Gryphe__Pantheon-RP-1.0-8b-Llama-3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Gryphe/Pantheon-RP-1.0-8b-Llama-3 | 70a6df202c9df9abdc6928bec5a5ab47f2667aee | 16.772417 | apache-2.0 | 46 | 8 | true | false | false | true | 0.720836 | 0.393252 | 39.325213 | 0.453908 | 23.631915 | 0.057402 | 5.740181 | 0.276007 | 3.467562 | 0.38324 | 5.504948 | 0.306682 | 22.964687 | false | false | 2024-05-08 | 2024-06-27 | 1 | meta-llama/Meta-Llama-3-8B |
Gryphe_Pantheon-RP-1.5-12b-Nemo_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Gryphe/Pantheon-RP-1.5-12b-Nemo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Gryphe/Pantheon-RP-1.5-12b-Nemo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Gryphe__Pantheon-RP-1.5-12b-Nemo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Gryphe/Pantheon-RP-1.5-12b-Nemo | 00107381f05f69666772d88a1b11affe77c94a47 | 21.311159 | apache-2.0 | 29 | 12 | true | false | false | true | 1.685583 | 0.476308 | 47.630842 | 0.519582 | 31.750144 | 0.048338 | 4.833837 | 0.272651 | 3.020134 | 0.442031 | 15.053906 | 0.330203 | 25.578088 | false | false | 2024-07-25 | 2024-08-04 | 1 | mistralai/Mistral-Nemo-Base-2407 |
Gryphe_Pantheon-RP-1.6-12b-Nemo_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Gryphe/Pantheon-RP-1.6-12b-Nemo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Gryphe/Pantheon-RP-1.6-12b-Nemo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Gryphe__Pantheon-RP-1.6-12b-Nemo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Gryphe/Pantheon-RP-1.6-12b-Nemo | 60cf38ae0367baf314e3cce748d9a199adfea557 | 20.365189 | apache-2.0 | 11 | 12 | true | false | false | true | 1.737253 | 0.448057 | 44.805671 | 0.520401 | 31.687344 | 0.033988 | 3.398792 | 0.277685 | 3.691275 | 0.42876 | 12.928385 | 0.331117 | 25.679669 | false | false | 2024-08-18 | 2024-08-31 | 1 | mistralai/Mistral-Nemo-Base-2407 |
Gryphe_Pantheon-RP-1.6-12b-Nemo-KTO_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Gryphe/Pantheon-RP-1.6-12b-Nemo-KTO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Gryphe/Pantheon-RP-1.6-12b-Nemo-KTO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Gryphe__Pantheon-RP-1.6-12b-Nemo-KTO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Gryphe/Pantheon-RP-1.6-12b-Nemo-KTO | 6cb6d8d9a7352d71f539ab5053987e058c090443 | 21.407541 | apache-2.0 | 5 | 12 | true | false | false | true | 1.682026 | 0.463619 | 46.361875 | 0.527698 | 33.0322 | 0.043807 | 4.380665 | 0.295302 | 6.040268 | 0.424792 | 12.165625 | 0.338182 | 26.464613 | false | false | 2024-08-28 | 2024-08-31 | 1 | mistralai/Mistral-Nemo-Base-2407 |
Gryphe_Pantheon-RP-Pure-1.6.2-22b-Small_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Gryphe/Pantheon-RP-Pure-1.6.2-22b-Small" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Gryphe/Pantheon-RP-Pure-1.6.2-22b-Small</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Gryphe__Pantheon-RP-Pure-1.6.2-22b-Small-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Gryphe/Pantheon-RP-Pure-1.6.2-22b-Small | d031830dcb3bc5ad9634374db4dd15b3ef6ebe0f | 27.823932 | other | 16 | 22 | true | false | false | true | 1.45332 | 0.693104 | 69.31043 | 0.530454 | 31.683163 | 0.183535 | 18.353474 | 0.328859 | 10.514541 | 0.376479 | 4.393229 | 0.394199 | 32.688756 | false | false | 2024-10-13 | 2024-10-15 | 1 | mistralai/Mistral-Small-Instruct-2409 |
Gunulhona_Gemma-Ko-Merge_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/Gunulhona/Gemma-Ko-Merge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Gunulhona/Gemma-Ko-Merge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Gunulhona__Gemma-Ko-Merge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Gunulhona/Gemma-Ko-Merge | ca6b0eb1405f21db6a7a9cce3b112d21fcfdde97 | 25.935394 | | 0 | 10 | false | false | false | true | 3.137248 | 0.641572 | 64.157214 | 0.581303 | 38.787197 | 0.001511 | 0.151057 | 0.33557 | 11.409396 | 0.404698 | 9.120573 | 0.387882 | 31.986924 | false | false | 2024-09-04 | 2024-10-23 | 1 | Gunulhona/Gemma-Ko-Merge (Merge) |
Gunulhona_Gemma-Ko-Merge-PEFT_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | <a target="_blank" href="https://huggingface.co/Gunulhona/Gemma-Ko-Merge-PEFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Gunulhona/Gemma-Ko-Merge-PEFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Gunulhona__Gemma-Ko-Merge-PEFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Gunulhona/Gemma-Ko-Merge-PEFT | ca6b0eb1405f21db6a7a9cce3b112d21fcfdde97 | 18.169495 | | 0 | 20 | false | false | false | false | 5.876477 | 0.288039 | 28.803907 | 0.515409 | 30.186273 | 0 | 0 | 0.324664 | 9.955257 | 0.40801 | 8.767969 | 0.381732 | 31.303561 | false | false | 2024-09-30 | 2024-10-17 | 0 | Gunulhona/Gemma-Ko-Merge-PEFT |
Gunulhona_Gemma-Ko-Merge-PEFT_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Adapter | ? | <a target="_blank" href="https://huggingface.co/Gunulhona/Gemma-Ko-Merge-PEFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Gunulhona/Gemma-Ko-Merge-PEFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Gunulhona__Gemma-Ko-Merge-PEFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Gunulhona/Gemma-Ko-Merge-PEFT | ca6b0eb1405f21db6a7a9cce3b112d21fcfdde97 | 18.06624 | | 0 | 20 | false | false | false | true | 9.394334 | 0.444135 | 44.41349 | 0.486299 | 26.015069 | 0 | 0 | 0.307047 | 7.606264 | 0.398583 | 7.05625 | 0.309757 | 23.306368 | false | false | 2024-09-30 | 2024-10-23 | 0 | Gunulhona/Gemma-Ko-Merge-PEFT |
HPAI-BSC_Llama3-Aloe-8B-Alpha_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HPAI-BSC/Llama3-Aloe-8B-Alpha" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HPAI-BSC/Llama3-Aloe-8B-Alpha</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HPAI-BSC__Llama3-Aloe-8B-Alpha-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HPAI-BSC/Llama3-Aloe-8B-Alpha | f0bce5c1fee5ea2a6679bb3dc9de8548e7262c9e | 20.104566 | cc-by-nc-4.0 | 53 | 8 | true | false | false | true | 0.795245 | 0.508107 | 50.810738 | 0.483085 | 27.145978 | 0.053625 | 5.362538 | 0.294463 | 5.928412 | 0.367271 | 5.875521 | 0.329538 | 25.504211 | false | false | 2024-04-26 | 2024-10-29 | 0 | HPAI-BSC/Llama3-Aloe-8B-Alpha |
HPAI-BSC_Llama3.1-Aloe-Beta-8B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HPAI-BSC/Llama3.1-Aloe-Beta-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HPAI-BSC/Llama3.1-Aloe-Beta-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HPAI-BSC__Llama3.1-Aloe-Beta-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HPAI-BSC/Llama3.1-Aloe-Beta-8B | 3f2f0bbfb03cb0a8310efa50659688c1f2c02da0 | 23.754809 | llama3.1 | 10 | 8 | true | false | false | true | 1.398697 | 0.725328 | 72.532769 | 0.509276 | 30.369625 | 0.016616 | 1.661631 | 0.268456 | 2.46085 | 0.383458 | 6.832292 | 0.358045 | 28.67169 | false | false | 2024-10-30 | 2024-11-07 | 0 | HPAI-BSC/Llama3.1-Aloe-Beta-8B |
Hastagaras_Llama-3.1-Jamet-8B-MK.I_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Hastagaras/Llama-3.1-Jamet-8B-MK.I" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Hastagaras/Llama-3.1-Jamet-8B-MK.I</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Hastagaras__Llama-3.1-Jamet-8B-MK.I-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Hastagaras/Llama-3.1-Jamet-8B-MK.I | 26cb97042b04fee7d0140375a7babbf92278f8ac | 25.39863 | llama3.1 | 1 | 8 | true | false | false | true | 0.71874 | 0.733821 | 73.382071 | 0.504867 | 29.503905 | 0.125378 | 12.537764 | 0.274329 | 3.243848 | 0.372604 | 6.142188 | 0.348238 | 27.582004 | false | false | 2024-11-18 | 2024-11-18 | 0 | Hastagaras/Llama-3.1-Jamet-8B-MK.I |
Hastagaras_Zabuza-8B-Llama-3.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Hastagaras/Zabuza-8B-Llama-3.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Hastagaras/Zabuza-8B-Llama-3.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Hastagaras__Zabuza-8B-Llama-3.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Hastagaras/Zabuza-8B-Llama-3.1 | 57ffa92f229b8308916aae1d64d8f0dc9baa0a34 | 19.711829 | llama3.1 | 0 | 8 | true | false | false | true | 0.675287 | 0.626534 | 62.653426 | 0.453892 | 23.220321 | 0.042296 | 4.229607 | 0.264262 | 1.901566 | 0.356792 | 4.898958 | 0.292304 | 21.367095 | true | false | 2024-11-05 | 2024-11-05 | 1 | Hastagaras/Zabuza-8B-Llama-3.1 (Merge) |
HiroseKoichi_Llama-Salad-4x8B-V3_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/HiroseKoichi/Llama-Salad-4x8B-V3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HiroseKoichi/Llama-Salad-4x8B-V3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HiroseKoichi__Llama-Salad-4x8B-V3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HiroseKoichi/Llama-Salad-4x8B-V3 | a343915429779efbd1478f01ba1f7fd9d8d226c0 | 24.93529 | llama3 | 5 | 24 | true | true | false | true | 2.137695 | 0.665352 | 66.535238 | 0.524465 | 31.928849 | 0.096677 | 9.667674 | 0.302852 | 7.04698 | 0.374031 | 6.453906 | 0.351812 | 27.979093 | true | false | 2024-06-17 | 2024-06-26 | 0 | HiroseKoichi/Llama-Salad-4x8B-V3 |
HuggingFaceH4_zephyr-7b-alpha_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceH4/zephyr-7b-alpha" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceH4/zephyr-7b-alpha</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceH4__zephyr-7b-alpha-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceH4/zephyr-7b-alpha | 2ce2d025864af849b3e5029e2ec9d568eeda892d | 18.571864 | mit | 1,102 | 7 | true | false | false | true | 0.795675 | 0.519148 | 51.914808 | 0.458786 | 23.955291 | 0.017372 | 1.73716 | 0.297819 | 6.375839 | 0.394958 | 7.503125 | 0.279505 | 19.944962 | false | true | 2023-10-09 | 2024-06-12 | 1 | mistralai/Mistral-7B-v0.1 |
HuggingFaceH4_zephyr-7b-beta_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceH4/zephyr-7b-beta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceH4/zephyr-7b-beta</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceH4__zephyr-7b-beta-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceH4/zephyr-7b-beta | b70e0c9a2d9e14bd1e812d3c398e5f313e93b473 | 17.767061 | mit | 1,617 | 7 | true | false | false | true | 0.555023 | 0.495043 | 49.504315 | 0.431582 | 21.487542 | 0.02719 | 2.719033 | 0.290268 | 5.369128 | 0.392542 | 7.734375 | 0.278092 | 19.787973 | false | true | 2023-10-26 | 2024-06-12 | 1 | mistralai/Mistral-7B-v0.1 |
HuggingFaceH4_zephyr-7b-gemma-v0.1_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | GemmaForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceH4/zephyr-7b-gemma-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceH4/zephyr-7b-gemma-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceH4__zephyr-7b-gemma-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceH4/zephyr-7b-gemma-v0.1 | 03b3427d0ed07d2e0f86c0a7e53d82d4beef9540 | 15.929338 | other | 121 | 8 | true | false | false | true | 1.481775 | 0.336374 | 33.637415 | 0.462374 | 23.751163 | 0.075529 | 7.55287 | 0.294463 | 5.928412 | 0.373969 | 4.179427 | 0.284741 | 20.526743 | false | true | 2024-03-01 | 2024-06-12 | 2 | google/gemma-7b |
HuggingFaceH4_zephyr-orpo-141b-A35b-v0.1_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceH4__zephyr-orpo-141b-A35b-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1 | a3be084543d278e61b64cd600f28157afc79ffd3 | 34.063023 | apache-2.0 | 261 | 140 | true | false | false | true | 42.067786 | 0.651089 | 65.108911 | 0.629044 | 47.503796 | 0.200906 | 20.090634 | 0.378356 | 17.114094 | 0.446521 | 14.715104 | 0.45861 | 39.845597 | false | true | 2024-04-10 | 2024-06-12 | 1 | mistral-community/Mixtral-8x22B-v0.1 |
HuggingFaceTB_SmolLM-1.7B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM-1.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM-1.7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM-1.7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceTB/SmolLM-1.7B | 673a07602ca1191e5bc2ddda428e2f608a0a14c0 | 5.425399 | apache-2.0 | 164 | 1 | true | false | false | false | 0.324307 | 0.236157 | 23.615673 | 0.318052 | 4.411128 | 0.007553 | 0.755287 | 0.241611 | 0 | 0.342094 | 2.128385 | 0.114777 | 1.641918 | false | true | 2024-07-14 | 2024-07-18 | 0 | HuggingFaceTB/SmolLM-1.7B |
HuggingFaceTB_SmolLM-1.7B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM-1.7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM-1.7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM-1.7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceTB/SmolLM-1.7B-Instruct | 0ad161e59935a9a691dfde2818df8b98786f30a7 | 5.138222 | apache-2.0 | 107 | 1 | true | false | false | true | 0.317023 | 0.234783 | 23.47826 | 0.288511 | 2.080374 | 0 | 0 | 0.260067 | 1.342282 | 0.348667 | 2.083333 | 0.116606 | 1.84508 | false | true | 2024-07-15 | 2024-07-18 | 1 | HuggingFaceTB/SmolLM-1.7B |
HuggingFaceTB_SmolLM-135M_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM-135M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM-135M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM-135M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceTB/SmolLM-135M | eec6e461571fba3e197a57c298f60b75422eae02 | 6.838197 | apache-2.0 | 180 | 0 | true | false | false | false | 0.343378 | 0.212476 | 21.247623 | 0.304605 | 3.2854 | 0.006798 | 0.679758 | 0.258389 | 1.118568 | 0.436604 | 13.342188 | 0.112201 | 1.355644 | false | true | 2024-07-14 | 2024-07-18 | 0 | HuggingFaceTB/SmolLM-135M |
HuggingFaceTB_SmolLM-135M-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM-135M-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM-135M-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM-135M-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceTB/SmolLM-135M-Instruct | 8ca7af58e27777cae460ad8ca3ab9db15f5c160d | 3.564171 | apache-2.0 | 98 | 0 | true | false | false | true | 0.467805 | 0.121401 | 12.140122 | 0.301508 | 2.692958 | 0 | 0 | 0.259228 | 1.230425 | 0.363458 | 3.365625 | 0.117603 | 1.955895 | false | true | 2024-07-15 | 2024-10-12 | 1 | HuggingFaceTB/SmolLM-135M |
HuggingFaceTB_SmolLM-360M_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM-360M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM-360M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM-360M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceTB/SmolLM-360M | 318cc630b73730bfd712e5873063156ffb8936b5 | 6.147596 | apache-2.0 | 62 | 0 | true | false | false | false | 0.36526 | 0.213351 | 21.335058 | 0.306452 | 3.284915 | 0.004532 | 0.453172 | 0.267617 | 2.348993 | 0.401781 | 8.089323 | 0.112367 | 1.374113 | false | true | 2024-07-14 | 2024-07-18 | 0 | HuggingFaceTB/SmolLM-360M |
HuggingFaceTB_SmolLM-360M-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM-360M-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM-360M-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM-360M-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceTB/SmolLM-360M-Instruct | 8e951de8c220295ea4f85d078c4e320df7137535 | 4.706784 | apache-2.0 | 77 | 0 | true | false | false | true | 0.366501 | 0.195165 | 19.516549 | 0.288511 | 2.080374 | 0 | 0 | 0.264262 | 1.901566 | 0.347177 | 2.897135 | 0.116606 | 1.84508 | false | true | 2024-07-15 | 2024-08-20 | 1 | HuggingFaceTB/SmolLM-360M |
HuggingFaceTB_SmolLM2-1.7B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM2-1.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM2-1.7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM2-1.7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceTB/SmolLM2-1.7B | 4fa12cab4f5f53670b05125fb9d2873af587d231 | 9.495504 | apache-2.0 | 83 | 1 | true | false | false | false | 0.325026 | 0.244 | 24.400036 | 0.345259 | 9.301788 | 0.021148 | 2.114804 | 0.279362 | 3.914989 | 0.348542 | 4.601042 | 0.213763 | 12.640366 | false | true | 2024-10-30 | 2024-11-06 | 0 | HuggingFaceTB/SmolLM2-1.7B |
HuggingFaceTB_SmolLM2-1.7B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM2-1.7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM2-1.7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM2-1.7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceTB/SmolLM2-1.7B-Instruct | d1bb90bcfbe0f211109880f4da18da66f229c4f6 | 14.745339 | apache-2.0 | 429 | 1 | true | false | false | true | 0.324961 | 0.536784 | 53.678351 | 0.359862 | 10.917989 | 0.041541 | 4.154079 | 0.279362 | 3.914989 | 0.342125 | 4.098958 | 0.205369 | 11.707668 | false | true | 2024-10-31 | 2024-11-06 | 1 | HuggingFaceTB/SmolLM2-1.7B-Instruct (Merge) |
HuggingFaceTB_SmolLM2-135M_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM2-135M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM2-135M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM2-135M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceTB/SmolLM2-135M | 28e66ca6931668447a3bac213f23d990ad3b0e2b | 5.557677 | apache-2.0 | 37 | 0 | true | false | false | false | 0.333905 | 0.1833 | 18.330031 | 0.304423 | 3.708078 | 0.002266 | 0.226586 | 0.248322 | 0 | 0.411177 | 10.030469 | 0.109458 | 1.050901 | false | true | 2024-10-31 | 2024-11-06 | 0 | HuggingFaceTB/SmolLM2-135M |
HuggingFaceTB_SmolLM2-135M-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM2-135M-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM2-135M-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM2-135M-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceTB/SmolLM2-135M-Instruct | 5a33ba103645800d7b3790c4448546c1b73efc71 | 6.467365 | apache-2.0 | 72 | 0 | true | false | false | true | 0.338376 | 0.288314 | 28.83139 | 0.312432 | 4.720808 | 0.003021 | 0.302115 | 0.235738 | 0 | 0.366219 | 3.677344 | 0.111453 | 1.272533 | false | true | 2024-10-31 | 2024-11-06 | 1 | HuggingFaceTB/SmolLM2-135M-Instruct (Merge) |
HuggingFaceTB_SmolLM2-135M-Instruct_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM2-135M-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM2-135M-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM2-135M-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceTB/SmolLM2-135M-Instruct | 5a33ba103645800d7b3790c4448546c1b73efc71 | 2.992599 | apache-2.0 | 72 | 0 | true | false | false | false | 0.348754 | 0.059252 | 5.925167 | 0.313475 | 4.796276 | 0.001511 | 0.151057 | 0.23406 | 0 | 0.387146 | 6.059896 | 0.109209 | 1.023197 | false | true | 2024-10-31 | 2024-11-14 | 1 | HuggingFaceTB/SmolLM2-135M-Instruct (Merge) |
HuggingFaceTB_SmolLM2-360M_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM2-360M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM2-360M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM2-360M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceTB/SmolLM2-360M | 3ce05f63c246c44616da500b47b01f082f4d3bcc | 6.100225 | apache-2.0 | 27 | 0 | true | false | false | false | 0.386658 | 0.211452 | 21.145228 | 0.323348 | 5.543603 | 0.003021 | 0.302115 | 0.245805 | 0 | 0.395427 | 7.728385 | 0.116938 | 1.882018 | false | true | 2024-10-31 | 2024-11-06 | 0 | HuggingFaceTB/SmolLM2-360M |
HuggingFaceTB_SmolLM2-360M-Instruct_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM2-360M-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM2-360M-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM2-360M-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceTB/SmolLM2-360M-Instruct | 4873f67095301d304753fae05bc09ec766634e50 | 3.10002 | apache-2.0 | 60 | 0 | true | false | false | false | 0.392382 | 0.083032 | 8.303191 | 0.30527 | 3.299047 | 0.008308 | 0.830816 | 0.265101 | 2.013423 | 0.342281 | 2.751823 | 0.112616 | 1.401817 | false | true | 2024-10-31 | 2024-11-14 | 0 | HuggingFaceTB/SmolLM2-360M-Instruct |
HuggingFaceTB_SmolLM2-360M-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM2-360M-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM2-360M-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM2-360M-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceTB/SmolLM2-360M-Instruct | 4873f67095301d304753fae05bc09ec766634e50 | 8.001097 | apache-2.0 | 60 | 0 | true | false | false | true | 0.375819 | 0.38416 | 38.415959 | 0.314351 | 4.173864 | 0.006798 | 0.679758 | 0.255034 | 0.671141 | 0.346125 | 2.765625 | 0.111702 | 1.300236 | false | true | 2024-10-31 | 2024-11-06 | 0 | HuggingFaceTB/SmolLM2-360M-Instruct |
HumanLLMs_Humanish-LLama3-8B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HumanLLMs/Humanish-LLama3-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HumanLLMs/Humanish-LLama3-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HumanLLMs__Humanish-LLama3-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HumanLLMs/Humanish-LLama3-8B-Instruct | 42f73ada2b7fb16f18a75404d72b7911bf1e65ce | 22.564911 | llama3 | 2 | 8 | true | false | false | true | 0.748278 | 0.64979 | 64.979033 | 0.496771 | 28.012477 | 0.095921 | 9.592145 | 0.255872 | 0.782998 | 0.358156 | 2.002865 | 0.37018 | 30.019947 | false | false | 2024-10-04 | 2024-10-05 | 1 | meta-llama/Meta-Llama-3-8B-Instruct |
HumanLLMs_Humanish-Mistral-Nemo-Instruct-2407_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/HumanLLMs/Humanish-Mistral-Nemo-Instruct-2407" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HumanLLMs/Humanish-Mistral-Nemo-Instruct-2407</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HumanLLMs__Humanish-Mistral-Nemo-Instruct-2407-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HumanLLMs/Humanish-Mistral-Nemo-Instruct-2407 | 45b80bdce8d447ef494af06751904afcc607eb37 | 23.0069 | apache-2.0 | 3 | 12 | true | false | false | true | 1.620283 | 0.545127 | 54.512693 | 0.526178 | 32.709613 | 0.083837 | 8.383686 | 0.287752 | 5.033557 | 0.39676 | 9.395052 | 0.352061 | 28.006797 | false | false | 2024-10-06 | 2024-10-06 | 2 | mistralai/Mistral-Nemo-Base-2407 |
HumanLLMs_Humanish-Qwen2.5-7B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/HumanLLMs/Humanish-Qwen2.5-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HumanLLMs/Humanish-Qwen2.5-7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HumanLLMs__Humanish-Qwen2.5-7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HumanLLMs/Humanish-Qwen2.5-7B-Instruct | 7d2c71d926832d6e257ad2776011494dbac2d151 | 26.665374 | apache-2.0 | 3 | 7 | true | false | false | true | 1.193393 | 0.728425 | 72.842502 | 0.536368 | 34.478998 | 0 | 0 | 0.298658 | 6.487696 | 0.398063 | 8.424479 | 0.439827 | 37.75857 | false | false | 2024-10-05 | 2024-10-05 | 2 | Qwen/Qwen2.5-7B |
IDEA-CCNL_Ziya-LLaMA-13B-v1_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/IDEA-CCNL/Ziya-LLaMA-13B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">IDEA-CCNL/Ziya-LLaMA-13B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/IDEA-CCNL__Ziya-LLaMA-13B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | IDEA-CCNL/Ziya-LLaMA-13B-v1 | 64d931f346e1a49ea3bbca07a83137075bab1c66 | 3.906425 | gpl-3.0 | 273 | 13 | true | false | false | false | 1.108257 | 0.169686 | 16.968643 | 0.287703 | 1.463617 | 0 | 0 | 0.249161 | 0 | 0.375052 | 3.88151 | 0.110123 | 1.124778 | false | true | 2023-05-16 | 2024-06-12 | 0 | IDEA-CCNL/Ziya-LLaMA-13B-v1 |
Infinirc_Infinirc-Llama3-8B-2G-Release-v1.0_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Infinirc/Infinirc-Llama3-8B-2G-Release-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Infinirc/Infinirc-Llama3-8B-2G-Release-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Infinirc__Infinirc-Llama3-8B-2G-Release-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Infinirc/Infinirc-Llama3-8B-2G-Release-v1.0 | 9c542d9ec3f86e145ae445c200c6ebe9066e8cd6 | 13.087133 | llama3 | 1 | 8 | true | false | false | false | 1.818723 | 0.202434 | 20.243399 | 0.435074 | 20.831165 | 0.012085 | 1.208459 | 0.299497 | 6.599553 | 0.460938 | 16.750521 | 0.216007 | 12.889702 | false | false | 2024-06-26 | 2024-09-29 | 0 | Infinirc/Infinirc-Llama3-8B-2G-Release-v1.0 |
Intel_neural-chat-7b-v3_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Intel/neural-chat-7b-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Intel/neural-chat-7b-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Intel__neural-chat-7b-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Intel/neural-chat-7b-v3 | fc679274dfcd28a8b6087634f71af7ed2a0659c4 | 17.943646 | apache-2.0 | 67 | 7 | true | false | false | false | 0.48929 | 0.277797 | 27.779736 | 0.504832 | 30.205692 | 0.021903 | 2.190332 | 0.291946 | 5.592841 | 0.50549 | 23.019531 | 0.269864 | 18.873744 | false | true | 2023-10-25 | 2024-06-12 | 1 | mistralai/Mistral-7B-v0.1 |
Intel_neural-chat-7b-v3-1_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Intel/neural-chat-7b-v3-1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Intel/neural-chat-7b-v3-1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Intel__neural-chat-7b-v3-1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Intel/neural-chat-7b-v3-1 | c0d379a49c1c0579529d5e6f2e936ddb759552a8 | 21.004986 | apache-2.0 | 545 | 7 | true | false | false | false | 0.563692 | 0.46869 | 46.868974 | 0.505157 | 29.739752 | 0.031722 | 3.172205 | 0.290268 | 5.369128 | 0.497896 | 22.236979 | 0.267786 | 18.642878 | false | true | 2023-11-14 | 2024-06-12 | 1 | mistralai/Mistral-7B-v0.1 |
Intel_neural-chat-7b-v3-2_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Intel/neural-chat-7b-v3-2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Intel/neural-chat-7b-v3-2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Intel__neural-chat-7b-v3-2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Intel/neural-chat-7b-v3-2 | 0d8f77647810d21d935ea90c66d6339b85e65a75 | 21.433647 | apache-2.0 | 56 | 7 | true | false | false | false | 0.560441 | 0.49884 | 49.883975 | 0.503223 | 30.237458 | 0.045317 | 4.531722 | 0.290268 | 5.369128 | 0.489521 | 20.056771 | 0.266705 | 18.522828 | false | true | 2023-11-21 | 2024-06-12 | 0 | Intel/neural-chat-7b-v3-2 |
Intel_neural-chat-7b-v3-3_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Intel/neural-chat-7b-v3-3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Intel/neural-chat-7b-v3-3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Intel__neural-chat-7b-v3-3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Intel/neural-chat-7b-v3-3 | bdd31cf498d13782cc7497cba5896996ce429f91 | 19.99112 | apache-2.0 | 75 | 7 | true | false | false | false | 0.559524 | 0.476259 | 47.625855 | 0.487662 | 27.753851 | 0.006798 | 0.679758 | 0.28943 | 5.257271 | 0.485958 | 20.578125 | 0.262467 | 18.051862 | false | true | 2023-12-09 | 2024-06-12 | 2 | mistralai/Mistral-7B-v0.1 |
IntervitensInc_internlm2_5-20b-llamafied_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/IntervitensInc/internlm2_5-20b-llamafied" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">IntervitensInc/internlm2_5-20b-llamafied</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/IntervitensInc__internlm2_5-20b-llamafied-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | IntervitensInc/internlm2_5-20b-llamafied | 0b6fc3cc0b9bf3529816061eb508483c20b77fe9 | 29.204293 | apache-2.0 | 2 | 19 | true | false | false | false | 1.381128 | 0.340995 | 34.099523 | 0.747847 | 63.47058 | 0.170695 | 17.069486 | 0.338087 | 11.744966 | 0.447542 | 14.942708 | 0.405086 | 33.898493 | false | false | 2024-08-06 | 2024-11-11 | 0 | IntervitensInc/internlm2_5-20b-llamafied |
Isaak-Carter_JOSIEv4o-8b-stage1-v4_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Isaak-Carter/JOSIEv4o-8b-stage1-v4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Isaak-Carter/JOSIEv4o-8b-stage1-v4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Isaak-Carter__JOSIEv4o-8b-stage1-v4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Isaak-Carter/JOSIEv4o-8b-stage1-v4 | a8380a7be51b547761824e524b3d95ac73203122 | 15.567377 | apache-2.0 | 1 | 8 | true | false | false | false | 0.890582 | 0.255266 | 25.526603 | 0.472497 | 25.787276 | 0.046828 | 4.682779 | 0.291946 | 5.592841 | 0.365438 | 6.079687 | 0.331616 | 25.735077 | false | false | 2024-08-03 | 2024-08-03 | 0 | Isaak-Carter/JOSIEv4o-8b-stage1-v4 |
Isaak-Carter_JOSIEv4o-8b-stage1-v4_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Isaak-Carter/JOSIEv4o-8b-stage1-v4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Isaak-Carter/JOSIEv4o-8b-stage1-v4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Isaak-Carter__JOSIEv4o-8b-stage1-v4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Isaak-Carter/JOSIEv4o-8b-stage1-v4 | a8380a7be51b547761824e524b3d95ac73203122 | 15.419272 | apache-2.0 | 1 | 8 | true | false | false | false | 0.879882 | 0.247697 | 24.769722 | 0.475807 | 25.919578 | 0.045317 | 4.531722 | 0.291107 | 5.480984 | 0.364104 | 6.346354 | 0.329205 | 25.467272 | false | false | 2024-08-03 | 2024-08-03 | 0 | Isaak-Carter/JOSIEv4o-8b-stage1-v4 |
Isaak-Carter_Josiefied-Qwen2.5-7B-Instruct-abliterated_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Isaak-Carter/Josiefied-Qwen2.5-7B-Instruct-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Isaak-Carter/Josiefied-Qwen2.5-7B-Instruct-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Isaak-Carter__Josiefied-Qwen2.5-7B-Instruct-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Isaak-Carter/Josiefied-Qwen2.5-7B-Instruct-abliterated | 879168f9ce9fac315a19dd4f4c7df5253bb660f2 | 26.857295 | | 0 | 7 | false | false | false | true | 1.076791 | 0.731747 | 73.174732 | 0.539638 | 34.904316 | 0 | 0 | 0.302852 | 7.04698 | 0.408667 | 9.616667 | 0.42761 | 36.401079 | false | false | | 2024-09-21 | 0 | Removed |
Isaak-Carter_Josiefied-Qwen2.5-7B-Instruct-abliterated-v2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Isaak-Carter/Josiefied-Qwen2.5-7B-Instruct-abliterated-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Isaak-Carter/Josiefied-Qwen2.5-7B-Instruct-abliterated-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Isaak-Carter__Josiefied-Qwen2.5-7B-Instruct-abliterated-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Isaak-Carter/Josiefied-Qwen2.5-7B-Instruct-abliterated-v2 | 5d07f58562422feb9f25c9c048e40356d2cf7e4b | 27.81796 | apache-2.0 | 4 | 7 | true | false | false | true | 1.130915 | 0.784104 | 78.410396 | 0.531092 | 33.29454 | 0 | 0 | 0.298658 | 6.487696 | 0.435396 | 13.957813 | 0.412816 | 34.757314 | false | false | 2024-09-20 | 2024-09-21 | 1 | Qwen/Qwen2.5-7B |
J-LAB_Thynk_orpo_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/J-LAB/Thynk_orpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">J-LAB/Thynk_orpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/J-LAB__Thynk_orpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | J-LAB/Thynk_orpo | c6606d402f26d005b9f1a71a1cde9139d1cffb2a | 16.974407 | 0 | 3 | false | false | false | false | 1.214764 | 0.210178 | 21.017788 | 0.446311 | 22.062784 | 0.130665 | 13.066465 | 0.292785 | 5.704698 | 0.451479 | 15.201563 | 0.323138 | 24.793144 | false | false | 2024-10-14 | 0 | Removed |
JackFram_llama-160m_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/JackFram/llama-160m" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JackFram/llama-160m</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JackFram__llama-160m-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | JackFram/llama-160m | aca9b687d1425f863dcf5de9a4c96e3fe36266dd | 4.599661 | apache-2.0 | 34 | 0 | true | false | false | false | 0.093474 | 0.179104 | 17.910367 | 0.288802 | 2.033606 | 0 | 0 | 0.261745 | 1.565996 | 0.379208 | 4.667708 | 0.112783 | 1.420287 | false | false | 2023-05-26 | 2024-11-30 | 0 | JackFram/llama-160m |
JackFram_llama-68m_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/JackFram/llama-68m" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JackFram/llama-68m</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JackFram__llama-68m-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | JackFram/llama-68m | 964a5d77df908b69f8d6476fb70e940425b04cb5 | 4.862635 | apache-2.0 | 25 | 0 | true | false | false | false | 0.060558 | 0.172634 | 17.263417 | 0.29363 | 2.591048 | 0 | 0 | 0.258389 | 1.118568 | 0.39099 | 6.607031 | 0.114362 | 1.595745 | false | false | 2023-07-19 | 2024-11-30 | 0 | JackFram/llama-68m |
Jacoby746_Casual-Magnum-34B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Jacoby746/Casual-Magnum-34B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jacoby746/Casual-Magnum-34B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jacoby746__Casual-Magnum-34B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Jacoby746/Casual-Magnum-34B | b628c6959441db75460cfd49536322b1ea46130e | 23.571335 | apache-2.0 | 1 | 34 | true | false | false | false | 3.426697 | 0.193017 | 19.301675 | 0.603205 | 43.051568 | 0.07855 | 7.854985 | 0.372483 | 16.331096 | 0.40776 | 8.403385 | 0.518368 | 46.485298 | true | false | 2024-10-01 | 2024-10-23 | 1 | Jacoby746/Casual-Magnum-34B (Merge) |
Jacoby746_Inf-Silent-Kunoichi-v0.1-2x7B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/Jacoby746/Inf-Silent-Kunoichi-v0.1-2x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jacoby746/Inf-Silent-Kunoichi-v0.1-2x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jacoby746__Inf-Silent-Kunoichi-v0.1-2x7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Jacoby746/Inf-Silent-Kunoichi-v0.1-2x7B | 9ab68beb6fe16cab2ab708b9af4417c89751d297 | 20.009948 | apache-2.0 | 0 | 12 | true | false | false | false | 1.860053 | 0.387982 | 38.798167 | 0.518546 | 32.387004 | 0.060423 | 6.042296 | 0.28943 | 5.257271 | 0.428042 | 12.338542 | 0.327128 | 25.236407 | false | false | 2024-09-19 | 2024-09-20 | 1 | Jacoby746/Inf-Silent-Kunoichi-v0.1-2x7B (Merge) |
Jacoby746_Inf-Silent-Kunoichi-v0.2-2x7B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/Jacoby746/Inf-Silent-Kunoichi-v0.2-2x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jacoby746/Inf-Silent-Kunoichi-v0.2-2x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jacoby746__Inf-Silent-Kunoichi-v0.2-2x7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Jacoby746/Inf-Silent-Kunoichi-v0.2-2x7B | 711263c24f812676eb382a31a5f0fed9bd8c16e4 | 19.917523 | apache-2.0 | 0 | 12 | true | false | false | false | 0.866265 | 0.363602 | 36.360191 | 0.520942 | 32.259184 | 0.056647 | 5.664653 | 0.300336 | 6.711409 | 0.431979 | 13.264062 | 0.327211 | 25.245641 | false | false | 2024-09-19 | 2024-09-21 | 1 | Jacoby746/Inf-Silent-Kunoichi-v0.2-2x7B (Merge) |
Jacoby746_Proto-Athena-4x7B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/Jacoby746/Proto-Athena-4x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jacoby746/Proto-Athena-4x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jacoby746__Proto-Athena-4x7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Jacoby746/Proto-Athena-4x7B | 450fcba7a630fb61a662f71936d37979226fced8 | 19.649696 | apache-2.0 | 0 | 24 | true | false | false | false | 1.676614 | 0.370296 | 37.029637 | 0.510655 | 30.870823 | 0.057402 | 5.740181 | 0.294463 | 5.928412 | 0.434771 | 13.813021 | 0.320645 | 24.516105 | false | false | 2024-09-21 | 2024-09-21 | 1 | Jacoby746/Proto-Athena-4x7B (Merge) |
Jacoby746_Proto-Athena-v0.2-4x7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/Jacoby746/Proto-Athena-v0.2-4x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jacoby746/Proto-Athena-v0.2-4x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jacoby746__Proto-Athena-v0.2-4x7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Jacoby746/Proto-Athena-v0.2-4x7B | 01feeded217ea83a8794e7968c8850859b5f0b14 | 19.143898 | apache-2.0 | 0 | 24 | true | false | false | false | 1.651372 | 0.375242 | 37.524214 | 0.506773 | 30.340844 | 0.05136 | 5.135952 | 0.298658 | 6.487696 | 0.421281 | 10.960156 | 0.319731 | 24.414524 | false | false | 2024-09-21 | 2024-09-21 | 1 | Jacoby746/Proto-Athena-v0.2-4x7B (Merge) |
Jacoby746_Proto-Harpy-Blazing-Light-v0.1-2x7B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/Jacoby746/Proto-Harpy-Blazing-Light-v0.1-2x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jacoby746/Proto-Harpy-Blazing-Light-v0.1-2x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jacoby746__Proto-Harpy-Blazing-Light-v0.1-2x7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Jacoby746/Proto-Harpy-Blazing-Light-v0.1-2x7B | bbb5d7c7a0c9e999e057ffa71eaa93d59d95b36b | 22.292392 | 0 | 12 | false | false | false | false | 0.881841 | 0.490472 | 49.047195 | 0.518685 | 32.63253 | 0.063444 | 6.344411 | 0.295302 | 6.040268 | 0.444969 | 14.121094 | 0.33012 | 25.568853 | false | false | 2024-09-22 | 2024-09-30 | 1 | Jacoby746/Proto-Harpy-Blazing-Light-v0.1-2x7B (Merge) |
Jacoby746_Proto-Harpy-Spark-v0.1-7B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Jacoby746/Proto-Harpy-Spark-v0.1-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jacoby746/Proto-Harpy-Spark-v0.1-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jacoby746__Proto-Harpy-Spark-v0.1-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Jacoby746/Proto-Harpy-Spark-v0.1-7B | 984cca02cd930b2e1b7b2a7d53471d32d9821cdd | 19.862588 | apache-2.0 | 0 | 7 | true | false | false | false | 0.595805 | 0.433269 | 43.326928 | 0.473577 | 26.91311 | 0.062689 | 6.268882 | 0.305369 | 7.38255 | 0.431667 | 12.291667 | 0.306932 | 22.992391 | true | false | 2024-09-22 | 2024-09-30 | 1 | Jacoby746/Proto-Harpy-Spark-v0.1-7B (Merge) |
Jimmy19991222_Llama-3-Instruct-8B-SimPO-v0.2_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Jimmy19991222/Llama-3-Instruct-8B-SimPO-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jimmy19991222/Llama-3-Instruct-8B-SimPO-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jimmy19991222__Llama-3-Instruct-8B-SimPO-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Jimmy19991222/Llama-3-Instruct-8B-SimPO-v0.2 | 53a517ceaef324efc3626be44140b4f18a010591 | 24.279948 | 0 | 8 | false | false | false | true | 0.513152 | 0.654037 | 65.403684 | 0.498371 | 29.123823 | 0.043051 | 4.305136 | 0.314597 | 8.612975 | 0.40125 | 8.389583 | 0.3686 | 29.844489 | false | false | 2024-09-06 | 0 | Removed |
Jimmy19991222_llama-3-8b-instruct-gapo-v2-bert-f1-beta10-gamma0.3-lr1.0e-6-1minus-rerun_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Jimmy19991222/llama-3-8b-instruct-gapo-v2-bert-f1-beta10-gamma0.3-lr1.0e-6-1minus-rerun" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jimmy19991222/llama-3-8b-instruct-gapo-v2-bert-f1-beta10-gamma0.3-lr1.0e-6-1minus-rerun</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jimmy19991222__llama-3-8b-instruct-gapo-v2-bert-f1-beta10-gamma0.3-lr1.0e-6-1minus-rerun-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Jimmy19991222/llama-3-8b-instruct-gapo-v2-bert-f1-beta10-gamma0.3-lr1.0e-6-1minus-rerun | 00c02a823b4ff1a6cfcded6085ba9630df633998 | 23.817704 | llama3 | 0 | 8 | true | false | false | true | 0.481791 | 0.671722 | 67.172214 | 0.48798 | 27.755229 | 0.040785 | 4.07855 | 0.294463 | 5.928412 | 0.404073 | 8.709115 | 0.363364 | 29.262707 | false | false | 2024-09-17 | 2024-09-18 | 1 | meta-llama/Meta-Llama-3-8B-Instruct |
Jimmy19991222_llama-3-8b-instruct-gapo-v2-bert_f1-beta10-gamma0.3-lr1.0e-6-scale-log_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Jimmy19991222/llama-3-8b-instruct-gapo-v2-bert_f1-beta10-gamma0.3-lr1.0e-6-scale-log" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jimmy19991222/llama-3-8b-instruct-gapo-v2-bert_f1-beta10-gamma0.3-lr1.0e-6-scale-log</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jimmy19991222__llama-3-8b-instruct-gapo-v2-bert_f1-beta10-gamma0.3-lr1.0e-6-scale-log-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Jimmy19991222/llama-3-8b-instruct-gapo-v2-bert_f1-beta10-gamma0.3-lr1.0e-6-scale-log | 99d9e31df5b7e88b1da78b1bd335cac3215dfd6e | 23.75627 | llama3 | 0 | 8 | true | false | false | true | 0.478535 | 0.655561 | 65.556058 | 0.493458 | 28.613597 | 0.033988 | 3.398792 | 0.30453 | 7.270694 | 0.40001 | 8.167969 | 0.365775 | 29.530511 | false | false | 2024-09-22 | 2024-09-22 | 1 | meta-llama/Meta-Llama-3-8B-Instruct |
Jimmy19991222_llama-3-8b-instruct-gapo-v2-bert_p-beta10-gamma0.3-lr1.0e-6-scale-log_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Jimmy19991222/llama-3-8b-instruct-gapo-v2-bert_p-beta10-gamma0.3-lr1.0e-6-scale-log" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jimmy19991222/llama-3-8b-instruct-gapo-v2-bert_p-beta10-gamma0.3-lr1.0e-6-scale-log</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jimmy19991222__llama-3-8b-instruct-gapo-v2-bert_p-beta10-gamma0.3-lr1.0e-6-scale-log-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Jimmy19991222/llama-3-8b-instruct-gapo-v2-bert_p-beta10-gamma0.3-lr1.0e-6-scale-log | 49a029ea2605d768e89b638ad78a59fd62d192ab | 22.797979 | llama3 | 0 | 8 | true | false | false | true | 0.522485 | 0.631506 | 63.150552 | 0.491641 | 27.666184 | 0.050604 | 5.060423 | 0.286074 | 4.809843 | 0.3935 | 7.0875 | 0.36112 | 29.013372 | false | false | 2024-09-22 | 2024-09-22 | 1 | meta-llama/Meta-Llama-3-8B-Instruct |
Jimmy19991222_llama-3-8b-instruct-gapo-v2-bleu-beta0.1-no-length-scale-gamma0.4_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Jimmy19991222/llama-3-8b-instruct-gapo-v2-bleu-beta0.1-no-length-scale-gamma0.4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jimmy19991222/llama-3-8b-instruct-gapo-v2-bleu-beta0.1-no-length-scale-gamma0.4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jimmy19991222__llama-3-8b-instruct-gapo-v2-bleu-beta0.1-no-length-scale-gamma0.4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Jimmy19991222/llama-3-8b-instruct-gapo-v2-bleu-beta0.1-no-length-scale-gamma0.4 | de8bb28ad7a9d1158f318a4461dc47ad03e6e560 | 22.827312 | 0 | 8 | false | false | false | true | 0.480371 | 0.628458 | 62.845805 | 0.498609 | 29.329732 | 0.017372 | 1.73716 | 0.292785 | 5.704698 | 0.401375 | 9.071875 | 0.354471 | 28.274601 | false | false | 2024-09-06 | 0 | Removed |
Jimmy19991222_llama-3-8b-instruct-gapo-v2-rouge2-beta10-1minus-gamma0.3-rerun_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Jimmy19991222/llama-3-8b-instruct-gapo-v2-rouge2-beta10-1minus-gamma0.3-rerun" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jimmy19991222/llama-3-8b-instruct-gapo-v2-rouge2-beta10-1minus-gamma0.3-rerun</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jimmy19991222__llama-3-8b-instruct-gapo-v2-rouge2-beta10-1minus-gamma0.3-rerun-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Jimmy19991222/llama-3-8b-instruct-gapo-v2-rouge2-beta10-1minus-gamma0.3-rerun | e9692d8dbe30273839763757aa9ef07a5fcf0c59 | 24.159026 | llama3 | 0 | 8 | true | false | false | true | 1.009359 | 0.66775 | 66.775046 | 0.494046 | 28.390676 | 0.047583 | 4.758308 | 0.306208 | 7.494407 | 0.398708 | 8.005208 | 0.365775 | 29.530511 | false | false | 2024-09-14 | 2024-09-15 | 1 | meta-llama/Meta-Llama-3-8B-Instruct |
Jimmy19991222_llama-3-8b-instruct-gapo-v2-rouge2-beta10-gamma0.3-lr1.0e-6-scale-log_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Jimmy19991222/llama-3-8b-instruct-gapo-v2-rouge2-beta10-gamma0.3-lr1.0e-6-scale-log" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jimmy19991222/llama-3-8b-instruct-gapo-v2-rouge2-beta10-gamma0.3-lr1.0e-6-scale-log</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jimmy19991222__llama-3-8b-instruct-gapo-v2-rouge2-beta10-gamma0.3-lr1.0e-6-scale-log-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Jimmy19991222/llama-3-8b-instruct-gapo-v2-rouge2-beta10-gamma0.3-lr1.0e-6-scale-log | 9ff0ce408abb8dbcf7efb9b6533338f2c344a355 | 23.858383 | llama3 | 0 | 8 | true | false | false | true | 0.501994 | 0.660506 | 66.050635 | 0.491601 | 28.075036 | 0.044562 | 4.456193 | 0.303691 | 7.158837 | 0.400042 | 7.805208 | 0.366439 | 29.604388 | false | false | 2024-09-22 | 2024-09-22 | 1 | meta-llama/Meta-Llama-3-8B-Instruct |
Jimmy19991222_llama-3-8b-instruct-gapo-v2-rougeL-beta10-gamma0.3-lr1.0e-6-scale-log_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Jimmy19991222/llama-3-8b-instruct-gapo-v2-rougeL-beta10-gamma0.3-lr1.0e-6-scale-log" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jimmy19991222/llama-3-8b-instruct-gapo-v2-rougeL-beta10-gamma0.3-lr1.0e-6-scale-log</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jimmy19991222__llama-3-8b-instruct-gapo-v2-rougeL-beta10-gamma0.3-lr1.0e-6-scale-log-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Jimmy19991222/llama-3-8b-instruct-gapo-v2-rougeL-beta10-gamma0.3-lr1.0e-6-scale-log | ec67f95c4d1813a34bbde52d0ad14824fd7111a0 | 23.742269 | llama3 | 0 | 8 | true | false | false | true | 0.486586 | 0.649191 | 64.919081 | 0.495249 | 28.562567 | 0.045317 | 4.531722 | 0.302013 | 6.935123 | 0.396135 | 7.383594 | 0.371094 | 30.121528 | false | false | 2024-09-22 | 2024-09-22 | 1 | meta-llama/Meta-Llama-3-8B-Instruct |
Joseph717171_Hermes-3-Llama-3.1-8B_TIES_with_Base_Embeds_Initialized_to_Special_Instruct_Toks_dtypeF32_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Joseph717171/Hermes-3-Llama-3.1-8B_TIES_with_Base_Embeds_Initialized_to_Special_Instruct_Toks_dtypeF32" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Joseph717171/Hermes-3-Llama-3.1-8B_TIES_with_Base_Embeds_Initialized_to_Special_Instruct_Toks_dtypeF32</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Joseph717171__Hermes-3-Llama-3.1-8B_TIES_with_Base_Embeds_Initialized_to_Special_Instruct_Toks_dtypeF32-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Joseph717171/Hermes-3-Llama-3.1-8B_TIES_with_Base_Embeds_Initialized_to_Special_Instruct_Toks_dtypeF32 | 823930851c57b11fd2e25cd65b5c53f909209d0e | 23.252877 | llama3.1 | 1 | 8 | true | false | false | true | 0.707545 | 0.618541 | 61.854103 | 0.517745 | 30.724097 | 0.05136 | 5.135952 | 0.282718 | 4.362416 | 0.436938 | 13.617187 | 0.314412 | 23.823508 | true | false | 2024-10-23 | 2024-10-25 | 0 | Joseph717171/Hermes-3-Llama-3.1-8B_TIES_with_Base_Embeds_Initialized_to_Special_Instruct_Toks_dtypeF32 |
Joseph717171_Llama-3.1-SuperNova-8B-Lite_TIES_with_Base_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Joseph717171/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Joseph717171/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Joseph717171__Llama-3.1-SuperNova-8B-Lite_TIES_with_Base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Joseph717171/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base | f1e2cad4dca10f948fd2ee9588f80df0b40d7232 | 30.081383 | llama3.1 | 8 | 8 | true | false | false | true | 0.874731 | 0.809633 | 80.963289 | 0.514742 | 31.465813 | 0.173716 | 17.371601 | 0.309564 | 7.941834 | 0.41099 | 10.740365 | 0.388049 | 32.005393 | true | false | 2024-10-02 | 2024-10-03 | 0 | Joseph717171/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base |
Josephgflowers_Cinder-Phi-2-V1-F16-gguf_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | PhiForCausalLM | <a target="_blank" href="https://huggingface.co/Josephgflowers/Cinder-Phi-2-V1-F16-gguf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Josephgflowers/Cinder-Phi-2-V1-F16-gguf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Josephgflowers__Cinder-Phi-2-V1-F16-gguf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Josephgflowers/Cinder-Phi-2-V1-F16-gguf | 85629ec9b18efee31d07630664e7a3815121badf | 10.855703 | mit | 4 | 2 | true | false | false | true | 0.471404 | 0.235657 | 23.565695 | 0.439662 | 22.453402 | 0 | 0 | 0.281879 | 4.250559 | 0.343458 | 1.965625 | 0.21609 | 12.898936 | false | false | 2024-02-25 | 2024-06-26 | 0 | Josephgflowers/Cinder-Phi-2-V1-F16-gguf |
Josephgflowers_Differential-Attention-Liquid-Metal-Tinyllama_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Josephgflowers/Differential-Attention-Liquid-Metal-Tinyllama" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Josephgflowers/Differential-Attention-Liquid-Metal-Tinyllama</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Josephgflowers__Differential-Attention-Liquid-Metal-Tinyllama-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Josephgflowers/Differential-Attention-Liquid-Metal-Tinyllama | bdb6c63ff1025241e8e10b1858d67dc410f0a702 | 4.709671 | mit | 0 | 1 | true | false | false | true | 0.173794 | 0.222692 | 22.269246 | 0.292556 | 2.552224 | 0 | 0 | 0.250839 | 0.111857 | 0.335552 | 0.94401 | 0.121426 | 2.380689 | false | false | 2024-11-05 | 2024-11-07 | 0 | Josephgflowers/Differential-Attention-Liquid-Metal-Tinyllama |
Josephgflowers_TinyLlama-Cinder-Agent-v1_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Josephgflowers/TinyLlama-Cinder-Agent-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Josephgflowers/TinyLlama-Cinder-Agent-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Josephgflowers__TinyLlama-Cinder-Agent-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Josephgflowers/TinyLlama-Cinder-Agent-v1 | a9cd8b48bfe30f29bb1f819213da9a4c41eee67f | 5.816564 | mit | 1 | 1 | true | false | false | true | 0.237832 | 0.266956 | 26.695612 | 0.311604 | 3.804167 | 0.003776 | 0.377644 | 0.244128 | 0 | 0.339458 | 2.232292 | 0.116107 | 1.789672 | false | false | 2024-05-21 | 2024-06-26 | 4 | Josephgflowers/TinyLlama-3T-Cinder-v1.2 |
Josephgflowers_TinyLlama-v1.1-Cinders-World_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Josephgflowers/TinyLlama-v1.1-Cinders-World" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Josephgflowers/TinyLlama-v1.1-Cinders-World</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Josephgflowers__TinyLlama-v1.1-Cinders-World-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Josephgflowers/TinyLlama-v1.1-Cinders-World | 11a2c305f787a7908dd87c4e5a7d0f1e314a1f05 | 5.129125 | mit | 0 | 1 | true | false | false | true | 0.257383 | 0.246923 | 24.692261 | 0.299797 | 3.107714 | 0.001511 | 0.151057 | 0.244128 | 0 | 0.335615 | 0.61849 | 0.119847 | 2.20523 | false | false | 2024-10-12 | 2024-10-13 | 0 | Josephgflowers/TinyLlama-v1.1-Cinders-World |
Josephgflowers_TinyLlama_v1.1_math_code-world-test-1_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Josephgflowers/TinyLlama_v1.1_math_code-world-test-1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Josephgflowers/TinyLlama_v1.1_math_code-world-test-1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Josephgflowers__TinyLlama_v1.1_math_code-world-test-1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Josephgflowers/TinyLlama_v1.1_math_code-world-test-1 | 6f7c2aaf0b8723bc6a1dc23a4a1ff0ec24dc11ec | 1.839166 | mit | 0 | 1 | true | false | false | false | 0.272944 | 0.007844 | 0.784363 | 0.314635 | 4.164017 | 0.009819 | 0.981873 | 0.23406 | 0 | 0.349906 | 3.638281 | 0.113198 | 1.46646 | false | false | 2024-06-23 | 2024-09-09 | 0 | Josephgflowers/TinyLlama_v1.1_math_code-world-test-1 |
Josephgflowers_Tinyllama-STEM-Cinder-Agent-v1_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Josephgflowers/Tinyllama-STEM-Cinder-Agent-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Josephgflowers/Tinyllama-STEM-Cinder-Agent-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Josephgflowers__Tinyllama-STEM-Cinder-Agent-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Josephgflowers/Tinyllama-STEM-Cinder-Agent-v1 | c6880b94e72dddbe591fdf30fa15fe42ea60b924 | 4.575881 | mit | 0 | 1 | true | false | false | true | 0.163386 | 0.212576 | 21.257597 | 0.308438 | 3.731313 | 0.000755 | 0.075529 | 0.234899 | 0 | 0.334125 | 1.432292 | 0.108627 | 0.958555 | false | false | 2024-11-27 | 2024-11-27 | 0 | Josephgflowers/Tinyllama-STEM-Cinder-Agent-v1 |
Junhoee_Qwen-Megumin_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Adapter | ? | <a target="_blank" href="https://huggingface.co/Junhoee/Qwen-Megumin" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Junhoee/Qwen-Megumin</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Junhoee__Qwen-Megumin-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Junhoee/Qwen-Megumin | bb46c15ee4bb56c5b63245ef50fd7637234d6f75 | 25.822493 | 0 | 15 | false | false | false | true | 1.889396 | 0.714112 | 71.411189 | 0.528527 | 33.642144 | 0 | 0 | 0.296141 | 6.152125 | 0.398031 | 8.18724 | 0.41988 | 35.542258 | false | false | 2024-11-26 | 2024-11-26 | 2 | Qwen/Qwen2.5-7B |
KSU-HW-SEC_Llama3-70b-SVA-FT-1415_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/KSU-HW-SEC/Llama3-70b-SVA-FT-1415" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">KSU-HW-SEC/Llama3-70b-SVA-FT-1415</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/KSU-HW-SEC__Llama3-70b-SVA-FT-1415-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | KSU-HW-SEC/Llama3-70b-SVA-FT-1415 | 1c09728455567898116d2d9cfb6cbbbbd4ee730c | 36.119233 | 0 | 70 | false | false | false | false | 9.601029 | 0.617991 | 61.799137 | 0.665015 | 51.328741 | 0.219789 | 21.978852 | 0.375 | 16.666667 | 0.456542 | 17.801042 | 0.524269 | 47.140957 | false | false | 2024-09-08 | 0 | Removed |
KSU-HW-SEC_Llama3-70b-SVA-FT-500_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/KSU-HW-SEC/Llama3-70b-SVA-FT-500" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">KSU-HW-SEC/Llama3-70b-SVA-FT-500</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/KSU-HW-SEC__Llama3-70b-SVA-FT-500-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | KSU-HW-SEC/Llama3-70b-SVA-FT-500 | 856a23f28aeada23d1135c86a37e05524307e8ed | 35.953712 | 0 | 70 | false | false | false | false | 9.473738 | 0.610522 | 61.05223 | 0.669224 | 51.887026 | 0.213746 | 21.374622 | 0.380872 | 17.449664 | 0.451146 | 16.993229 | 0.522689 | 46.965499 | false | false | 2024-09-08 | 0 | Removed |
KSU-HW-SEC_Llama3-70b-SVA-FT-final_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/KSU-HW-SEC/Llama3-70b-SVA-FT-final" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">KSU-HW-SEC/Llama3-70b-SVA-FT-final</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/KSU-HW-SEC__Llama3-70b-SVA-FT-final-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | KSU-HW-SEC/Llama3-70b-SVA-FT-final | 391bbd94173b34975d1aa2c7356977a630253b75 | 36.093837 | 0 | 70 | false | false | false | false | 9.656199 | 0.616468 | 61.646764 | 0.665015 | 51.328741 | 0.219789 | 21.978852 | 0.375 | 16.666667 | 0.456542 | 17.801042 | 0.524269 | 47.140957 | false | false | 2024-09-08 | 0 | Removed |
KSU-HW-SEC_Llama3.1-70b-SVA-FT-1000step_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/KSU-HW-SEC/Llama3.1-70b-SVA-FT-1000step" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">KSU-HW-SEC/Llama3.1-70b-SVA-FT-1000step</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/KSU-HW-SEC__Llama3.1-70b-SVA-FT-1000step-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | KSU-HW-SEC/Llama3.1-70b-SVA-FT-1000step | b195fea0d8f350ff29243d4e88654b1baa5af79e | 40.750259 | 0 | 70 | false | false | false | false | 12.554447 | 0.723804 | 72.380395 | 0.690312 | 55.485365 | 0.320997 | 32.099698 | 0.395973 | 19.463087 | 0.459177 | 17.830469 | 0.525183 | 47.242538 | false | false | 2024-09-08 | 0 | Removed |
Kimargin_GPT-NEO-1.3B-wiki_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | GPTNeoForCausalLM | <a target="_blank" href="https://huggingface.co/Kimargin/GPT-NEO-1.3B-wiki" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Kimargin/GPT-NEO-1.3B-wiki</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Kimargin__GPT-NEO-1.3B-wiki-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Kimargin/GPT-NEO-1.3B-wiki | 92fa51fa6589f6e8fdfcc83f085216b3dae11da5 | 5.248478 | apache-2.0 | 1 | 1 | true | false | false | false | 0.832745 | 0.192068 | 19.206816 | 0.302634 | 3.423612 | 0.008308 | 0.830816 | 0.244966 | 0 | 0.38826 | 6.932552 | 0.109874 | 1.097074 | false | false | 2024-10-23 | 2024-10-24 | 1 | Kimargin/GPT-NEO-1.3B-wiki (Merge) |
KingNish_Qwen2.5-0.5b-Test-ft_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/KingNish/Qwen2.5-0.5b-Test-ft" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">KingNish/Qwen2.5-0.5b-Test-ft</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/KingNish__Qwen2.5-0.5b-Test-ft-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | KingNish/Qwen2.5-0.5b-Test-ft | f905bb1d37c7853fb5c7157d8d3ad0f062b65c0f | 7.475184 | apache-2.0 | 5 | 0 | true | false | false | false | 0.66869 | 0.267081 | 26.708134 | 0.323153 | 6.058845 | 0.012085 | 1.208459 | 0.263423 | 1.789709 | 0.342125 | 1.432292 | 0.168883 | 7.653664 | false | false | 2024-09-26 | 2024-09-29 | 1 | KingNish/Qwen2.5-0.5b-Test-ft (Merge) |