eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
princeton-nlp_Llama-3-Instruct-8B-RRHF-v0.2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-RRHF-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-RRHF-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-RRHF-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-Instruct-8B-RRHF-v0.2 | 81191fbb214d17f0a4fec247da5d648f4cb61ef1 | 23.753751 | 0 | 8.03 | true | false | false | true | 0.505873 | 0.712488 | 71.248842 | 0.49839 | 28.498724 | 0.087613 | 8.761329 | 0.260067 | 1.342282 | 0.373781 | 5.089323 | 0.348238 | 27.582004 | false | false | 2024-07-06 | 2024-10-07 | 0 | princeton-nlp/Llama-3-Instruct-8B-RRHF-v0.2 |
princeton-nlp_Llama-3-Instruct-8B-SLiC-HF_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-SLiC-HF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-SLiC-HF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-SLiC-HF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-Instruct-8B-SLiC-HF | 7e9001f6f4fe940c363bb7ea1814d33c79b21737 | 25.056382 | 0 | 8.03 | true | false | false | true | 0.725192 | 0.739966 | 73.996551 | 0.502942 | 29.211612 | 0.082326 | 8.232628 | 0.286074 | 4.809843 | 0.372292 | 5.369792 | 0.358461 | 28.717863 | false | false | 2024-07-06 | 2024-10-07 | 0 | princeton-nlp/Llama-3-Instruct-8B-SLiC-HF |
princeton-nlp_Llama-3-Instruct-8B-SLiC-HF-v0.2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-SLiC-HF-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-SLiC-HF-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-SLiC-HF-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-Instruct-8B-SLiC-HF-v0.2 | 1821cc42189d8dab9e157c31b223dc60fc037c2d | 23.728355 | 0 | 8.03 | true | false | false | true | 0.521239 | 0.710965 | 71.096468 | 0.49839 | 28.498724 | 0.087613 | 8.761329 | 0.260067 | 1.342282 | 0.373781 | 5.089323 | 0.348238 | 27.582004 | false | false | 2024-07-06 | 2024-10-07 | 0 | princeton-nlp/Llama-3-Instruct-8B-SLiC-HF-v0.2 |
princeton-nlp_Llama-3-Instruct-8B-SimPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-SimPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-SimPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-SimPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-Instruct-8B-SimPO | f700cb6afb4509b10dea43ab72bb0e260e166be4 | 22.657116 | 57 | 8.03 | true | false | false | true | 0.533346 | 0.65039 | 65.038985 | 0.484468 | 26.709133 | 0.02568 | 2.567976 | 0.293624 | 5.816555 | 0.394833 | 8.154167 | 0.348903 | 27.655881 | false | false | 2024-05-17 | 2024-09-28 | 0 | princeton-nlp/Llama-3-Instruct-8B-SimPO |
princeton-nlp_Llama-3-Instruct-8B-SimPO-v0.2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-SimPO-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-SimPO-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-SimPO-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Llama-3-Instruct-8B-SimPO-v0.2 | 9ac0fbee445e7755e50520e9881d67588b4b854c | 24.474601 | 6 | 8.03 | true | false | false | true | 0.579982 | 0.680865 | 68.086455 | 0.503834 | 29.214022 | 0.057402 | 5.740181 | 0.301174 | 6.823266 | 0.398802 | 7.85026 | 0.362201 | 29.133422 | false | false | 2024-07-06 | 2024-09-28 | 0 | princeton-nlp/Llama-3-Instruct-8B-SimPO-v0.2 |
princeton-nlp_Mistral-7B-Base-SFT-CPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-CPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-CPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-CPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Base-SFT-CPO | 7f67394668b94a9ddfb64daff8976b48b135d96c | 17.373794 | 1 | 7.242 | true | false | false | true | 0.809769 | 0.465493 | 46.549267 | 0.438215 | 21.857696 | 0.026435 | 2.643505 | 0.291946 | 5.592841 | 0.407083 | 9.252083 | 0.265126 | 18.34737 | false | false | 2024-07-06 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Base-SFT-CPO |
princeton-nlp_Mistral-7B-Base-SFT-DPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Base-SFT-DPO | 17134fd80cfbf3980353967a30dc6f450f18f78f | 16.236325 | 0 | 7.242 | true | false | false | true | 0.66762 | 0.440338 | 44.03383 | 0.435011 | 20.79098 | 0.016616 | 1.661631 | 0.272651 | 3.020134 | 0.412229 | 9.628646 | 0.264545 | 18.282728 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Base-SFT-DPO |
princeton-nlp_Mistral-7B-Base-SFT-IPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-IPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-IPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-IPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Base-SFT-IPO | eea781724e4d2ab8bdda7c13526f042de4cfae41 | 17.210428 | 0 | 7.242 | true | false | false | true | 0.667334 | 0.482953 | 48.295301 | 0.445802 | 23.703491 | 0.024924 | 2.492447 | 0.280201 | 4.026846 | 0.377625 | 4.836458 | 0.279172 | 19.908023 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Base-SFT-IPO |
princeton-nlp_Mistral-7B-Base-SFT-KTO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-KTO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-KTO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-KTO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Base-SFT-KTO | 02148bb9241b0f4bb0c75e93893eed005abe25e8 | 18.96264 | 0 | 7.242 | true | false | false | true | 0.666017 | 0.478482 | 47.848154 | 0.447643 | 23.107642 | 0.036254 | 3.625378 | 0.290268 | 5.369128 | 0.436781 | 13.03099 | 0.287151 | 20.794548 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Base-SFT-KTO |
princeton-nlp_Mistral-7B-Base-SFT-RDPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-RDPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-RDPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-RDPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Base-SFT-RDPO | 2a63a6d9e1978c99444e440371268f7c2b7e0375 | 16.465757 | 0 | 7.242 | true | false | false | true | 0.662505 | 0.460647 | 46.064664 | 0.443953 | 22.98201 | 0.020393 | 2.039275 | 0.277685 | 3.691275 | 0.357938 | 4.275521 | 0.277676 | 19.7418 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Base-SFT-RDPO |
princeton-nlp_Mistral-7B-Base-SFT-RRHF_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-RRHF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-RRHF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-RRHF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Base-SFT-RRHF | 0d5861072e9d01f420451bf6a5b108bc8d3a76bc | 16.194613 | 0 | 7.242 | true | false | false | true | 0.669001 | 0.440663 | 44.0663 | 0.428059 | 19.598831 | 0.02568 | 2.567976 | 0.290268 | 5.369128 | 0.418677 | 10.034635 | 0.239777 | 15.530807 | false | false | 2024-07-06 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Base-SFT-RRHF |
princeton-nlp_Mistral-7B-Base-SFT-SLiC-HF_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-SLiC-HF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-SLiC-HF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-SLiC-HF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Base-SFT-SLiC-HF | 65d2cc49ad05258da3d982b39682c7f672f5e4ab | 18.955533 | 0 | 7.242 | true | false | false | true | 0.668442 | 0.512728 | 51.272845 | 0.44224 | 22.304723 | 0.032477 | 3.247734 | 0.291946 | 5.592841 | 0.426083 | 11.527083 | 0.278092 | 19.787973 | false | false | 2024-07-06 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Base-SFT-SLiC-HF |
princeton-nlp_Mistral-7B-Base-SFT-SimPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-SimPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-SimPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-SimPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Base-SFT-SimPO | 9d9e8b8de4f673d45bc826efc4a1444f9d480222 | 16.893545 | 0 | 7.242 | true | false | false | true | 0.635706 | 0.470064 | 47.006387 | 0.439805 | 22.332886 | 0.006042 | 0.60423 | 0.283557 | 4.474273 | 0.397063 | 8.032813 | 0.270196 | 18.910683 | false | false | 2024-05-17 | 2024-09-21 | 0 | princeton-nlp/Mistral-7B-Base-SFT-SimPO |
princeton-nlp_Mistral-7B-Instruct-CPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-CPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-CPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-CPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Instruct-CPO | 32492f8e5588f06005689ac944c2ea39c394c28e | 15.565535 | 0 | 7.242 | true | false | false | true | 0.645922 | 0.420305 | 42.030479 | 0.406922 | 17.248538 | 0.021903 | 2.190332 | 0.26594 | 2.12528 | 0.417844 | 10.897135 | 0.270113 | 18.901448 | false | false | 2024-07-06 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Instruct-CPO |
princeton-nlp_Mistral-7B-Instruct-DPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Instruct-DPO | 5e96cff70d8db87cf17c616429c17c8dc9352543 | 16.549607 | 0 | 7.242 | true | false | false | true | 0.605267 | 0.517624 | 51.762435 | 0.406036 | 16.875389 | 0.030211 | 3.021148 | 0.268456 | 2.46085 | 0.383333 | 5.75 | 0.27485 | 19.427822 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Instruct-DPO |
princeton-nlp_Mistral-7B-Instruct-IPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-IPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-IPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-IPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Instruct-IPO | 32ad99c6e7231bbe8ebd9d24b28e084c60848558 | 17.707096 | 0 | 7.242 | true | false | false | true | 0.625748 | 0.49292 | 49.29199 | 0.432218 | 20.09411 | 0.019637 | 1.963746 | 0.27349 | 3.131991 | 0.432417 | 12.785417 | 0.270778 | 18.975325 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Instruct-IPO |
princeton-nlp_Mistral-7B-Instruct-KTO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-KTO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-KTO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-KTO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Instruct-KTO | 834422e5b9b9eee6aac2f8d4822b925a6574d628 | 16.664827 | 0 | 7.242 | true | false | false | true | 0.603378 | 0.490797 | 49.079664 | 0.413959 | 17.812648 | 0.024169 | 2.416918 | 0.27349 | 3.131991 | 0.395271 | 7.408854 | 0.28125 | 20.138889 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Instruct-KTO |
princeton-nlp_Mistral-7B-Instruct-ORPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-ORPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-ORPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-ORPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Instruct-ORPO | 69c0481f4100629a49ae73f760ddbb61d8e98e48 | 16.050529 | 0 | 7.242 | true | false | false | true | 0.624297 | 0.471962 | 47.196217 | 0.410406 | 18.038373 | 0.02719 | 2.719033 | 0.274329 | 3.243848 | 0.39124 | 6.638281 | 0.266207 | 18.46742 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Instruct-ORPO |
princeton-nlp_Mistral-7B-Instruct-RDPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-RDPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-RDPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-RDPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Instruct-RDPO | 23ec6ab4f996134eb15c19322dabb34d7332d7cd | 16.420491 | 0 | 7.242 | true | false | false | true | 0.610616 | 0.488723 | 48.872325 | 0.405015 | 17.048388 | 0.024169 | 2.416918 | 0.280201 | 4.026846 | 0.387333 | 6.416667 | 0.277676 | 19.7418 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Instruct-RDPO |
princeton-nlp_Mistral-7B-Instruct-RRHF_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-RRHF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-RRHF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-RRHF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Instruct-RRHF | 493d3ceb571232fe3b2f55c0bf78692760f4fc7e | 16.829083 | 0 | 7.242 | true | false | false | true | 0.587751 | 0.496017 | 49.601723 | 0.418977 | 19.206552 | 0.024169 | 2.416918 | 0.276007 | 3.467562 | 0.397875 | 7.934375 | 0.265126 | 18.34737 | false | false | 2024-07-06 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Instruct-RRHF |
princeton-nlp_Mistral-7B-Instruct-SLiC-HF_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-SLiC-HF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-SLiC-HF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-SLiC-HF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Instruct-SLiC-HF | 3d08c8b7c3e73beb2a3264848f17246b74c3d162 | 16.376556 | 0 | 7.242 | true | false | false | true | 0.622453 | 0.511529 | 51.152941 | 0.404001 | 16.653429 | 0.016616 | 1.661631 | 0.272651 | 3.020134 | 0.391302 | 6.71276 | 0.271526 | 19.058437 | false | false | 2024-07-06 | 2024-10-16 | 0 | princeton-nlp/Mistral-7B-Instruct-SLiC-HF |
princeton-nlp_Mistral-7B-Instruct-SimPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-SimPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-SimPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-SimPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Mistral-7B-Instruct-SimPO | 03191ee1e60d21a698d11a515703a037073724f8 | 17.569551 | 2 | 7.242 | false | false | false | true | 0.570562 | 0.46869 | 46.868974 | 0.450723 | 22.382277 | 0.026435 | 2.643505 | 0.278523 | 3.803132 | 0.409781 | 9.75599 | 0.279671 | 19.963431 | false | false | 2024-05-24 | 2024-09-21 | 0 | princeton-nlp/Mistral-7B-Instruct-SimPO |
princeton-nlp_Sheared-LLaMA-1.3B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Sheared-LLaMA-1.3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Sheared-LLaMA-1.3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Sheared-LLaMA-1.3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Sheared-LLaMA-1.3B | a4b76938edbf571ea7d7d9904861cbdca08809b4 | 5.505397 | apache-2.0 | 93 | 1.3 | true | false | false | false | 0.3546 | 0.21977 | 21.977021 | 0.319705 | 4.74463 | 0.008308 | 0.830816 | 0.239933 | 0 | 0.371302 | 3.579427 | 0.117104 | 1.900488 | false | false | 2023-10-10 | 2024-07-29 | 0 | princeton-nlp/Sheared-LLaMA-1.3B |
princeton-nlp_Sheared-LLaMA-2.7B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/Sheared-LLaMA-2.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Sheared-LLaMA-2.7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Sheared-LLaMA-2.7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/Sheared-LLaMA-2.7B | 2f157a0306b75d37694ae05f6a4067220254d540 | 6.324627 | apache-2.0 | 60 | 2.7 | true | false | false | false | 0.47005 | 0.241652 | 24.165215 | 0.325869 | 5.655521 | 0.006042 | 0.60423 | 0.275168 | 3.355705 | 0.356729 | 2.091146 | 0.118684 | 2.075946 | false | false | 2023-10-10 | 2024-07-29 | 0 | princeton-nlp/Sheared-LLaMA-2.7B |
princeton-nlp_gemma-2-9b-it-DPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/gemma-2-9b-it-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/gemma-2-9b-it-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__gemma-2-9b-it-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/gemma-2-9b-it-DPO | f646c99fc3aa7afc7b22c3c7115fd03a40fc1d22 | 19.434035 | 8 | 9.242 | false | false | false | true | 2.890627 | 0.276872 | 27.687203 | 0.594144 | 41.593654 | 0 | 0 | 0.33557 | 11.409396 | 0.382031 | 5.653906 | 0.37234 | 30.260047 | false | false | 2024-07-16 | 2024-09-19 | 2 | google/gemma-2-9b |
princeton-nlp_gemma-2-9b-it-SimPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/princeton-nlp/gemma-2-9b-it-SimPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/gemma-2-9b-it-SimPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__gemma-2-9b-it-SimPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | princeton-nlp/gemma-2-9b-it-SimPO | 8c87091f412e3aa6f74f66bd86c57fb81cbc3fde | 21.161652 | mit | 144 | 9.242 | true | false | false | true | 2.769004 | 0.320686 | 32.068578 | 0.583918 | 40.09343 | 0 | 0 | 0.33557 | 11.409396 | 0.412323 | 10.340365 | 0.397523 | 33.058141 | false | false | 2024-07-16 | 2024-08-10 | 2 | google/gemma-2-9b |
prithivMLmods_Bellatrix-1.5B-xElite_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Bellatrix-1.5B-xElite" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Bellatrix-1.5B-xElite</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Bellatrix-1.5B-xElite-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Bellatrix-1.5B-xElite | 4ec39cef1bf7701abb30dda694b4918c517d1c0d | 9.547601 | apache-2.0 | 8 | 1.777 | true | false | false | false | 0.599664 | 0.196414 | 19.64144 | 0.35012 | 9.486709 | 0.126133 | 12.613293 | 0.278523 | 3.803132 | 0.361906 | 4.438281 | 0.165725 | 7.302748 | false | false | 2025-01-25 | 2025-01-27 | 1 | prithivMLmods/Bellatrix-1.5B-xElite (Merge) |
prithivMLmods_Bellatrix-Tiny-1.5B-R1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Bellatrix-Tiny-1.5B-R1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Bellatrix-Tiny-1.5B-R1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Bellatrix-Tiny-1.5B-R1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Bellatrix-Tiny-1.5B-R1 | db777568b86dc8aebb654b9167497912e004843e | 14.184095 | apache-2.0 | 8 | 1.544 | true | false | false | false | 0.584481 | 0.335225 | 33.522498 | 0.402217 | 15.85758 | 0.052115 | 5.21148 | 0.298658 | 6.487696 | 0.368292 | 4.569792 | 0.2751 | 19.455526 | false | false | 2025-01-31 | 2025-02-02 | 1 | prithivMLmods/Bellatrix-Tiny-1.5B-R1 (Merge) |
prithivMLmods_Bellatrix-Tiny-1B-v2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Bellatrix-Tiny-1B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Bellatrix-Tiny-1B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Bellatrix-Tiny-1B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Bellatrix-Tiny-1B-v2 | d82282c0853688ed16e3b9e121a09d063c566cc5 | 5.970924 | llama3.2 | 8 | 1.236 | true | false | false | false | 0.386866 | 0.150952 | 15.09517 | 0.326768 | 6.032562 | 0.024924 | 2.492447 | 0.272651 | 3.020134 | 0.343021 | 3.710937 | 0.149269 | 5.474291 | false | false | 2025-01-26 | 2025-01-27 | 1 | prithivMLmods/Bellatrix-Tiny-1B-v2 (Merge) |
prithivMLmods_Blaze-14B-xElite_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Blaze-14B-xElite" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Blaze-14B-xElite</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Blaze-14B-xElite-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Blaze-14B-xElite | 1795ffecee7322e697edfd0f900c7155ae2878b9 | 28.946758 | llama3.1 | 7 | 14.66 | true | false | false | false | 0.940855 | 0.03632 | 3.63203 | 0.662782 | 51.573264 | 0.358761 | 35.876133 | 0.394295 | 19.239374 | 0.46249 | 17.677865 | 0.511137 | 45.681885 | false | false | 2025-01-28 | 2025-01-28 | 0 | prithivMLmods/Blaze-14B-xElite |
prithivMLmods_COCO-7B-Instruct-1M_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/COCO-7B-Instruct-1M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/COCO-7B-Instruct-1M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__COCO-7B-Instruct-1M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/COCO-7B-Instruct-1M | a8ccc848bd1db0f05172a4e1c2197a0d3b4f25c5 | 28.171845 | apache-2.0 | 8 | 7.616 | true | false | false | false | 0.669052 | 0.47431 | 47.431039 | 0.540996 | 34.677883 | 0.30287 | 30.287009 | 0.307886 | 7.718121 | 0.43824 | 13.513281 | 0.418634 | 35.403738 | false | false | 2025-01-25 | 2025-01-27 | 1 | prithivMLmods/COCO-7B-Instruct-1M (Merge) |
prithivMLmods_Calcium-Opus-14B-Elite_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Calcium-Opus-14B-Elite" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Calcium-Opus-14B-Elite</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Calcium-Opus-14B-Elite-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Calcium-Opus-14B-Elite | a8661f82079677c777595e4259dbaf5a72c8f134 | 38.377957 | apache-2.0 | 10 | 14.766 | true | false | false | false | 2.012399 | 0.605152 | 60.515211 | 0.631736 | 46.934158 | 0.376888 | 37.688822 | 0.374161 | 16.55481 | 0.485958 | 20.778125 | 0.53017 | 47.796616 | false | false | 2025-01-23 | 2025-01-23 | 1 | prithivMLmods/Calcium-Opus-14B-Elite (Merge) |
prithivMLmods_Calcium-Opus-14B-Elite_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Calcium-Opus-14B-Elite" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Calcium-Opus-14B-Elite</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Calcium-Opus-14B-Elite-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Calcium-Opus-14B-Elite | a8661f82079677c777595e4259dbaf5a72c8f134 | 38.249365 | apache-2.0 | 10 | 14.766 | true | false | false | false | 2.022333 | 0.606351 | 60.635115 | 0.62959 | 46.532809 | 0.370846 | 37.084592 | 0.373322 | 16.442953 | 0.487323 | 20.948698 | 0.530668 | 47.852024 | false | false | 2025-01-23 | 2025-01-23 | 1 | prithivMLmods/Calcium-Opus-14B-Elite (Merge) |
prithivMLmods_Calcium-Opus-14B-Elite-1M_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Calcium-Opus-14B-Elite-1M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Calcium-Opus-14B-Elite-1M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Calcium-Opus-14B-Elite-1M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Calcium-Opus-14B-Elite-1M | 07f093df0a87d5d13e4325aa54eb62de9322721c | 35.110144 | apache-2.0 | 10 | 14.77 | true | false | false | false | 1.946804 | 0.561288 | 56.128849 | 0.63294 | 46.935523 | 0.295317 | 29.531722 | 0.352349 | 13.646532 | 0.467604 | 18.283854 | 0.515209 | 46.134382 | false | false | 2025-01-25 | 2025-01-27 | 1 | prithivMLmods/Calcium-Opus-14B-Elite-1M (Merge) |
prithivMLmods_Calcium-Opus-14B-Elite-Stock_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Calcium-Opus-14B-Elite-Stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Calcium-Opus-14B-Elite-Stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Calcium-Opus-14B-Elite-Stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Calcium-Opus-14B-Elite-Stock | e3b7fa2d20fa3e7a92bb7a99ad05219c9a86a95d | 36.4921 | 8 | 14.766 | false | false | false | false | 1.98723 | 0.614295 | 61.429452 | 0.632877 | 46.897899 | 0.271903 | 27.190332 | 0.368289 | 15.771812 | 0.48075 | 20.060417 | 0.528424 | 47.602689 | false | false | 2025-01-25 | 2025-01-25 | 1 | prithivMLmods/Calcium-Opus-14B-Elite-Stock (Merge) |
prithivMLmods_Calcium-Opus-14B-Elite2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Calcium-Opus-14B-Elite2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Calcium-Opus-14B-Elite2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Calcium-Opus-14B-Elite2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Calcium-Opus-14B-Elite2 | 0d948a368ff62658c06f90219849d8a6be29b78e | 38.449708 | apache-2.0 | 8 | 14.766 | true | false | false | false | 2.012723 | 0.617617 | 61.761681 | 0.631826 | 46.80615 | 0.361027 | 36.102719 | 0.369966 | 15.995526 | 0.493958 | 22.244792 | 0.530086 | 47.787382 | false | false | 2025-01-24 | 2025-01-25 | 1 | prithivMLmods/Calcium-Opus-14B-Elite2 (Merge) |
prithivMLmods_Calcium-Opus-14B-Elite2-R1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Calcium-Opus-14B-Elite2-R1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Calcium-Opus-14B-Elite2-R1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Calcium-Opus-14B-Elite2-R1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Calcium-Opus-14B-Elite2-R1 | 8d57bcd85bdfe2cb41f0e84ceb7beabcdc1e63fb | 37.97209 | apache-2.0 | 13 | 14.766 | true | false | false | false | 1.839015 | 0.632579 | 63.257933 | 0.636236 | 47.337096 | 0.298338 | 29.833837 | 0.39094 | 18.791946 | 0.48999 | 21.415365 | 0.524767 | 47.196365 | false | false | 2025-02-01 | 2025-02-02 | 1 | prithivMLmods/Calcium-Opus-14B-Elite2-R1 (Merge) |
prithivMLmods_Calcium-Opus-14B-Elite3_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Calcium-Opus-14B-Elite3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Calcium-Opus-14B-Elite3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Calcium-Opus-14B-Elite3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Calcium-Opus-14B-Elite3 | 6be2c8ea522ff941fa1ed5bec18949ac4c3b5651 | 35.857734 | apache-2.0 | 8 | 14.766 | true | false | false | false | 2.012315 | 0.542829 | 54.282858 | 0.63504 | 47.0746 | 0.293807 | 29.380665 | 0.370805 | 16.107383 | 0.479479 | 20.134896 | 0.533494 | 48.166002 | false | false | 2025-01-25 | 2025-01-25 | 1 | prithivMLmods/Calcium-Opus-14B-Elite3 (Merge) |
prithivMLmods_Calcium-Opus-14B-Elite4_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Calcium-Opus-14B-Elite4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Calcium-Opus-14B-Elite4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Calcium-Opus-14B-Elite4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Calcium-Opus-14B-Elite4 | 59525af6aae57e700ff9cd6ce9c6b3257f422f4c | 34.540949 | apache-2.0 | 9 | 14.766 | true | false | false | false | 1.958301 | 0.611197 | 61.119718 | 0.619526 | 45.208475 | 0.230363 | 23.036254 | 0.355705 | 14.09396 | 0.468719 | 17.689844 | 0.514877 | 46.097444 | false | false | 2025-01-25 | 2025-01-25 | 1 | prithivMLmods/Calcium-Opus-14B-Elite4 (Merge) |
prithivMLmods_Calcium-Opus-14B-Merge_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Calcium-Opus-14B-Merge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Calcium-Opus-14B-Merge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Calcium-Opus-14B-Merge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Calcium-Opus-14B-Merge | ceb41ff76990a24d2f4ff29f1c342fcd7322948a | 35.795652 | 8 | 14.766 | false | false | false | false | 2.069258 | 0.494943 | 49.494342 | 0.631929 | 46.766668 | 0.330816 | 33.081571 | 0.370805 | 16.107383 | 0.486083 | 20.927083 | 0.535572 | 48.396868 | false | false | 2025-01-24 | 2025-01-24 | 1 | prithivMLmods/Calcium-Opus-14B-Merge (Merge) |
prithivMLmods_Calcium-Opus-20B-v1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Calcium-Opus-20B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Calcium-Opus-20B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Calcium-Opus-20B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Calcium-Opus-20B-v1 | 28395429552eb6f22cd3dc8b54cd03e47c6132c9 | 26.849891 | apache-2.0 | 8 | 19.173 | true | false | false | false | 2.73627 | 0.309272 | 30.927162 | 0.599033 | 41.805576 | 0.110272 | 11.02719 | 0.353188 | 13.758389 | 0.494333 | 22.091667 | 0.473404 | 41.489362 | false | false | 2025-01-19 | 2025-01-23 | 1 | prithivMLmods/Calcium-Opus-20B-v1 (Merge) |
prithivMLmods_Codepy-Deepthink-3B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Codepy-Deepthink-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Codepy-Deepthink-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Codepy-Deepthink-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Codepy-Deepthink-3B | 73551f0560645b098ff8293e70ff633bfc72c125 | 17.367825 | creativeml-openrail-m | 8 | 3.213 | true | false | false | false | 0.605503 | 0.43272 | 43.271963 | 0.425945 | 18.640888 | 0.111782 | 11.178248 | 0.279362 | 3.914989 | 0.331021 | 3.977604 | 0.309009 | 23.223257 | false | false | 2024-12-26 | 2025-01-12 | 1 | prithivMLmods/Codepy-Deepthink-3B (Merge) |
prithivMLmods_Deepthink-Reasoning-14B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Deepthink-Reasoning-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Deepthink-Reasoning-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Deepthink-Reasoning-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Deepthink-Reasoning-14B | 08fd00d4ac2bf07766c8bab7e73d17028487d23a | 34.795154 | apache-2.0 | 10 | 14.77 | true | false | false | false | 1.950391 | 0.542354 | 54.235429 | 0.633405 | 47.306257 | 0.244713 | 24.471299 | 0.366611 | 15.548098 | 0.473156 | 19.477865 | 0.529588 | 47.731974 | false | false | 2025-01-20 | 2025-01-22 | 1 | prithivMLmods/Deepthink-Reasoning-14B (Merge) |
prithivMLmods_Deepthink-Reasoning-7B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Deepthink-Reasoning-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Deepthink-Reasoning-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Deepthink-Reasoning-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Deepthink-Reasoning-7B | 0ccaa3825ded55cf8cfa18f7db53d91848e3733b | 26.894145 | creativeml-openrail-m | 16 | 7.616 | true | false | false | false | 0.626998 | 0.484002 | 48.400245 | 0.550507 | 35.623731 | 0.200906 | 20.090634 | 0.299497 | 6.599553 | 0.443229 | 13.436979 | 0.434924 | 37.213726 | false | false | 2024-12-28 | 2025-01-09 | 1 | prithivMLmods/Deepthink-Reasoning-7B (Merge) |
prithivMLmods_FastThink-0.5B-Tiny_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/FastThink-0.5B-Tiny" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/FastThink-0.5B-Tiny</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__FastThink-0.5B-Tiny-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/FastThink-0.5B-Tiny | c07fd949ceba096d7c2e405bcfce99e269f7ca39 | 7.252605 | apache-2.0 | 9 | 0.494 | true | false | false | false | 0.537537 | 0.257989 | 25.79888 | 0.320558 | 5.01961 | 0.004532 | 0.453172 | 0.260906 | 1.454139 | 0.356635 | 3.579427 | 0.164894 | 7.210402 | false | false | 2025-01-20 | 2025-01-24 | 1 | prithivMLmods/FastThink-0.5B-Tiny (Merge) |
prithivMLmods_GWQ-9B-Preview_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/GWQ-9B-Preview" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/GWQ-9B-Preview</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__GWQ-9B-Preview-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/GWQ-9B-Preview | 5a0e00ac0ff885f54ef32e607508895bae864006 | 29.915362 | gemma | 9 | 9.242 | true | false | false | false | 2.461162 | 0.506584 | 50.658364 | 0.580575 | 40.669723 | 0.212236 | 21.223565 | 0.339765 | 11.96868 | 0.495104 | 21.821354 | 0.398354 | 33.150488 | false | false | 2025-01-04 | 2025-01-08 | 0 | prithivMLmods/GWQ-9B-Preview |
prithivMLmods_GWQ-9B-Preview2_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/GWQ-9B-Preview2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/GWQ-9B-Preview2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__GWQ-9B-Preview2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/GWQ-9B-Preview2 | 42f5d4f7d19eb59c9408ff70cdbc30459ec1ad3d | 29.870954 | creativeml-openrail-m | 16 | 9.242 | true | false | false | false | 2.452824 | 0.520897 | 52.089678 | 0.579722 | 40.184861 | 0.226586 | 22.65861 | 0.326342 | 10.178971 | 0.48599 | 20.815365 | 0.399684 | 33.298242 | false | false | 2025-01-04 | 2025-01-08 | 1 | prithivMLmods/GWQ-9B-Preview2 (Merge) |
prithivMLmods_GWQ2b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/GWQ2b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/GWQ2b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__GWQ2b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/GWQ2b | 1d2a808ec30008a2cba697b1bb742ab67efb71f0 | 16.404535 | gemma | 10 | 2.614 | true | false | false | false | 1.204289 | 0.411487 | 41.148708 | 0.414337 | 17.68035 | 0.061178 | 6.117825 | 0.282718 | 4.362416 | 0.431115 | 12.75599 | 0.247257 | 16.361924 | false | false | 2025-01-09 | 2025-01-12 | 1 | prithivMLmods/GWQ2b (Merge) |
prithivMLmods_Llama-3.1-5B-Instruct_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Llama-3.1-5B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Llama-3.1-5B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Llama-3.1-5B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Llama-3.1-5B-Instruct | 310ab744cd88aecedc534abd373d2f66a0c82f19 | 3.955411 | llama3.1 | 7 | 5.413 | true | false | false | false | 0.499552 | 0.14066 | 14.066012 | 0.305107 | 3.109216 | 0 | 0 | 0.264262 | 1.901566 | 0.354 | 2.616667 | 0.118351 | 2.039007 | false | false | 2025-01-04 | 2025-01-12 | 0 | prithivMLmods/Llama-3.1-5B-Instruct |
prithivMLmods_Llama-3.1-8B-Open-SFT_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Llama-3.1-8B-Open-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Llama-3.1-8B-Open-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Llama-3.1-8B-Open-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Llama-3.1-8B-Open-SFT | e5d7fa281735f7fcc09fdb5810a2118789040d67 | 20.942999 | creativeml-openrail-m | 11 | 8.03 | true | false | false | false | 0.727794 | 0.412262 | 41.226169 | 0.496798 | 28.179928 | 0.115559 | 11.555891 | 0.309564 | 7.941834 | 0.390365 | 8.728906 | 0.352227 | 28.025266 | false | false | 2024-12-18 | 2025-01-12 | 1 | prithivMLmods/Llama-3.1-8B-Open-SFT (Merge) |
prithivMLmods_Llama-3.2-3B-Math-Oct_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Llama-3.2-3B-Math-Oct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Llama-3.2-3B-Math-Oct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Llama-3.2-3B-Math-Oct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Llama-3.2-3B-Math-Oct | 5d72ae9689eb8307a741c6e7a455e427a792cd15 | 17.416778 | llama3.2 | 7 | 3.213 | true | false | false | false | 0.595683 | 0.458523 | 45.852338 | 0.437184 | 19.94675 | 0.114048 | 11.404834 | 0.258389 | 1.118568 | 0.34699 | 4.940365 | 0.29114 | 21.23781 | false | false | 2025-01-22 | 2025-01-24 | 1 | prithivMLmods/Llama-3.2-3B-Math-Oct (Merge) |
prithivMLmods_Llama-3.2-6B-AlgoCode_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Llama-3.2-6B-AlgoCode" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Llama-3.2-6B-AlgoCode</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Llama-3.2-6B-AlgoCode-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Llama-3.2-6B-AlgoCode | e111d34ff9033fe36b4f1c283a17d017b4e4e5c6 | 9.250648 | llama3.2 | 7 | 6.339 | true | false | false | false | 0.77736 | 0.213576 | 21.357554 | 0.374774 | 11.602526 | 0.010574 | 1.057402 | 0.286913 | 4.9217 | 0.401344 | 7.701302 | 0.179771 | 8.863401 | false | false | 2025-01-10 | 2025-01-12 | 0 | prithivMLmods/Llama-3.2-6B-AlgoCode |
prithivMLmods_Llama-8B-Distill-CoT_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Llama-8B-Distill-CoT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Llama-8B-Distill-CoT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Llama-8B-Distill-CoT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Llama-8B-Distill-CoT | 4c2d02c2cd92f4c371547201027202ac42d88a71 | 19.107331 | llama3.1 | 10 | 8.03 | true | false | false | false | 0.717502 | 0.334151 | 33.415116 | 0.429762 | 19.595123 | 0.30136 | 30.135952 | 0.28943 | 5.257271 | 0.371979 | 6.997396 | 0.273188 | 19.243129 | false | false | 2025-01-21 | 2025-01-22 | 1 | prithivMLmods/Llama-8B-Distill-CoT (Merge) |
prithivMLmods_Llama-Deepsync-1B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Llama-Deepsync-1B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Llama-Deepsync-1B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Llama-Deepsync-1B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Llama-Deepsync-1B | 03a9a38ffbb49f0f176a901a5fab3e444d6131fe | 10.118362 | creativeml-openrail-m | 9 | 1.236 | true | false | false | false | 0.377137 | 0.357007 | 35.700719 | 0.338563 | 7.763873 | 0.034743 | 3.47432 | 0.260067 | 1.342282 | 0.35651 | 4.230469 | 0.173787 | 8.198508 | false | false | 2024-12-29 | 2025-01-12 | 1 | prithivMLmods/Llama-Deepsync-1B (Merge) |
prithivMLmods_Llama-Deepsync-3B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Llama-Deepsync-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Llama-Deepsync-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Llama-Deepsync-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Llama-Deepsync-3B | 9f7c81f997f9a35797b511197e48a64ffb6d046f | 17.063214 | creativeml-openrail-m | 15 | 3.213 | true | false | false | false | 0.608077 | 0.430222 | 43.022181 | 0.429152 | 18.963664 | 0.111027 | 11.102719 | 0.271812 | 2.908277 | 0.332385 | 3.814844 | 0.303108 | 22.567598 | false | false | 2024-12-29 | 2025-01-12 | 1 | prithivMLmods/Llama-Deepsync-3B (Merge) |
prithivMLmods_Llama-Express.1-Math_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Llama-Express.1-Math" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Llama-Express.1-Math</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Llama-Express.1-Math-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Llama-Express.1-Math | 9c32d92f0ef3a4c4935992c9a5074d7a65ea91bc | 12.082506 | llama3.2 | 7 | 1.236 | true | false | false | true | 0.356045 | 0.508432 | 50.843207 | 0.336381 | 7.19902 | 0.050604 | 5.060423 | 0.263423 | 1.789709 | 0.314344 | 0.826302 | 0.160987 | 6.776374 | false | false | 2025-01-21 | 2025-01-25 | 1 | prithivMLmods/Llama-Express.1-Math (Merge) |
prithivMLmods_LwQ-10B-Instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/LwQ-10B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/LwQ-10B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__LwQ-10B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/LwQ-10B-Instruct | 3db52014aba9ec7163c28af47aac1f07af8fe0f6 | 20.866756 | llama3.1 | 7 | 10.732 | true | false | false | false | 0.725175 | 0.393477 | 39.347709 | 0.512171 | 31.590273 | 0.033988 | 3.398792 | 0.312081 | 8.277405 | 0.454396 | 16.832813 | 0.331782 | 25.753546 | false | false | 2025-01-14 | 2025-01-19 | 1 | prithivMLmods/LwQ-10B-Instruct (Merge) |
prithivMLmods_LwQ-Reasoner-10B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/LwQ-Reasoner-10B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/LwQ-Reasoner-10B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__LwQ-Reasoner-10B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/LwQ-Reasoner-10B | fcd46007bd9f098004843dd79042a99543a22293 | 26.730028 | llama3.1 | 8 | 10.306 | true | false | false | false | 0.894598 | 0.294134 | 29.413401 | 0.586625 | 40.337248 | 0.342145 | 34.214502 | 0.346477 | 12.863535 | 0.407854 | 8.581771 | 0.414727 | 34.96971 | false | false | 2025-01-18 | 2025-01-19 | 1 | prithivMLmods/LwQ-Reasoner-10B (Merge) |
prithivMLmods_Megatron-Opus-14B-Exp_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Megatron-Opus-14B-Exp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Megatron-Opus-14B-Exp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Megatron-Opus-14B-Exp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Megatron-Opus-14B-Exp | d6c56465b7610abbebbf6cdedae6fda92087fbfc | 36.92701 | apache-2.0 | 9 | 14.766 | true | false | false | false | 1.914034 | 0.497941 | 49.794102 | 0.651609 | 50.002879 | 0.351208 | 35.120846 | 0.375 | 16.666667 | 0.488656 | 21.082031 | 0.54006 | 48.895538 | false | false | 2025-02-03 | 2025-02-03 | 1 | prithivMLmods/Megatron-Opus-14B-Exp (Merge) |
prithivMLmods_Megatron-Opus-14B-Stock_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Megatron-Opus-14B-Stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Megatron-Opus-14B-Stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Megatron-Opus-14B-Stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Megatron-Opus-14B-Stock | a9d75a507fb0e9320e70c120b0f1823cc377cea2 | 36.200447 | 8 | 14.766 | false | false | false | false | 1.825261 | 0.517375 | 51.737501 | 0.641175 | 48.128851 | 0.327795 | 32.779456 | 0.375 | 16.666667 | 0.482021 | 20.185937 | 0.529338 | 47.70427 | false | false | 2025-02-03 | 2025-02-03 | 1 | prithivMLmods/Megatron-Opus-14B-Stock (Merge) |
prithivMLmods_Megatron-Opus-7B-Exp_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Megatron-Opus-7B-Exp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Megatron-Opus-7B-Exp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Megatron-Opus-7B-Exp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Megatron-Opus-7B-Exp | 1856f046b2fe15ccf1baac686aa4595ab4245f86 | 27.39114 | llama3.1 | 7 | 7.456 | true | false | false | false | 0.598404 | 0.60173 | 60.173008 | 0.536715 | 34.371535 | 0.183535 | 18.353474 | 0.311242 | 8.165548 | 0.418583 | 11.05625 | 0.390043 | 32.227024 | false | false | 2025-02-03 | 2025-02-03 | 0 | prithivMLmods/Megatron-Opus-7B-Exp |
prithivMLmods_Omni-Reasoner-Merged_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Omni-Reasoner-Merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Omni-Reasoner-Merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Omni-Reasoner-Merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Omni-Reasoner-Merged | 5c34ad1b2510c510025ac724a16bed7f5ae5f1c3 | 28.429225 | 9 | 7.616 | false | false | false | false | 0.631416 | 0.459947 | 45.994738 | 0.550785 | 35.361777 | 0.284743 | 28.47432 | 0.303691 | 7.158837 | 0.461646 | 16.205729 | 0.43642 | 37.37995 | false | false | 2025-01-16 | 2025-01-17 | 1 | prithivMLmods/Omni-Reasoner-Merged (Merge) |
prithivMLmods_Omni-Reasoner3-Merged_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Omni-Reasoner3-Merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Omni-Reasoner3-Merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Omni-Reasoner3-Merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Omni-Reasoner3-Merged | a8fbe5740e04a78661dedd16597fa4d5a135ad95 | 18.421149 | 7 | 3.213 | false | false | false | false | 0.5883 | 0.49347 | 49.346955 | 0.438785 | 20.586522 | 0.108006 | 10.800604 | 0.264262 | 1.901566 | 0.352229 | 6.228646 | 0.294963 | 21.662603 | false | false | 2025-01-17 | 2025-01-17 | 1 | prithivMLmods/Omni-Reasoner3-Merged (Merge) |
prithivMLmods_Phi-4-Empathetic_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Phi-4-Empathetic" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Phi-4-Empathetic</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Phi-4-Empathetic-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Phi-4-Empathetic | 181a87cfc05f0ee538b14cf4a773ad3b816224fe | 28.158044 | mit | 13 | 14.66 | true | false | false | false | 0.897638 | 0.049659 | 4.965935 | 0.672682 | 52.838938 | 0.259063 | 25.906344 | 0.380034 | 17.337808 | 0.499135 | 22.72526 | 0.506566 | 45.17398 | false | false | 2025-01-10 | 2025-01-12 | 1 | prithivMLmods/Phi-4-Empathetic (Merge) |
prithivMLmods_Phi-4-Math-IO_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Phi-4-Math-IO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Phi-4-Math-IO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Phi-4-Math-IO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Phi-4-Math-IO | 2e3f81b0c1613d33a4b0e216120fa3a3dd9206f8 | 25.804663 | mit | 9 | 14.66 | true | false | false | false | 0.966947 | 0.058977 | 5.897685 | 0.666826 | 52.093771 | 0.096677 | 9.667674 | 0.39849 | 19.798658 | 0.487292 | 20.644792 | 0.520529 | 46.725399 | false | false | 2025-01-10 | 2025-01-12 | 1 | prithivMLmods/Phi-4-Math-IO (Merge) |
prithivMLmods_Phi-4-QwQ_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Phi-4-QwQ" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Phi-4-QwQ</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Phi-4-QwQ-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Phi-4-QwQ | f9d9cc11a7c9e56420b705ac97f06362321dd89a | 29.047166 | mit | 13 | 14.66 | true | false | false | false | 0.985764 | 0.055929 | 5.592938 | 0.669557 | 52.28685 | 0.324773 | 32.477341 | 0.39094 | 18.791946 | 0.465063 | 17.632813 | 0.52751 | 47.501108 | false | false | 2025-01-10 | 2025-01-12 | 1 | prithivMLmods/Phi-4-QwQ (Merge) |
prithivMLmods_Phi-4-Super_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Phi-4-Super" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Phi-4-Super</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Phi-4-Super-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Phi-4-Super | d0632dd9df3d6a8ae4f10f2185d38eeb61cab9d2 | 30.274078 | 8 | 14.66 | false | false | false | false | 0.962645 | 0.048136 | 4.813561 | 0.672012 | 52.697295 | 0.342145 | 34.214502 | 0.394295 | 19.239374 | 0.504375 | 23.280208 | 0.526596 | 47.399527 | false | false | 2025-01-23 | 2025-01-24 | 1 | prithivMLmods/Phi-4-Super (Merge) |
prithivMLmods_Phi-4-Super-1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Phi-4-Super-1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Phi-4-Super-1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Phi-4-Super-1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Phi-4-Super-1 | 081e3442df878853ab8bd765430c961658ce5024 | 30.104386 | 9 | 14.66 | false | false | false | false | 0.935598 | 0.041766 | 4.176585 | 0.672934 | 52.905831 | 0.344411 | 34.441088 | 0.393456 | 19.127517 | 0.50174 | 22.917448 | 0.523521 | 47.057846 | false | false | 2025-01-24 | 2025-01-24 | 1 | prithivMLmods/Phi-4-Super-1 (Merge) |
prithivMLmods_Phi-4-Super-o1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Phi-4-Super-o1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Phi-4-Super-o1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Phi-4-Super-o1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Phi-4-Super-o1 | 081e3442df878853ab8bd765430c961658ce5024 | 30.104386 | 9 | 14.66 | false | false | false | false | 0.964152 | 0.041766 | 4.176585 | 0.672934 | 52.905831 | 0.344411 | 34.441088 | 0.393456 | 19.127517 | 0.50174 | 22.917448 | 0.523521 | 47.057846 | false | false | 2025-01-24 | 2025-01-24 | 1 | prithivMLmods/Phi-4-Super-o1 (Merge) |
prithivMLmods_Phi-4-o1_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Phi-4-o1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Phi-4-o1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Phi-4-o1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Phi-4-o1 | aa2a7571e9dbce0fefe98479fe04f298f2491b8c | 30.116173 | mit | 26 | 14.66 | true | false | false | false | 0.869083 | 0.028976 | 2.897645 | 0.668873 | 52.170862 | 0.39426 | 39.425982 | 0.38255 | 17.673378 | 0.497771 | 22.154687 | 0.51737 | 46.374483 | false | false | 2025-01-08 | 2025-01-09 | 1 | prithivMLmods/Phi-4-o1 (Merge) |
prithivMLmods_Phi4-Super_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Phi4-Super" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Phi4-Super</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Phi4-Super-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Phi4-Super | d27188b144a6ac8c2d70f761e8afd8b05c74fd16 | 30.274078 | 8 | 14.66 | false | false | false | false | 0.918532 | 0.048136 | 4.813561 | 0.672012 | 52.697295 | 0.342145 | 34.214502 | 0.394295 | 19.239374 | 0.504375 | 23.280208 | 0.526596 | 47.399527 | false | false | 2025-01-23 | 2025-01-23 | 1 | prithivMLmods/Phi4-Super (Merge) |
prithivMLmods_Primal-Opus-14B-Optimus-v1_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Primal-Opus-14B-Optimus-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Primal-Opus-14B-Optimus-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Primal-Opus-14B-Optimus-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Primal-Opus-14B-Optimus-v1 | 240fd09db4f126801d50bd74a74700927918c2d4 | 35.963707 | apache-2.0 | 9 | 14.766 | true | false | false | false | 1.989405 | 0.501313 | 50.131318 | 0.641942 | 48.271703 | 0.332326 | 33.232628 | 0.372483 | 16.331096 | 0.484719 | 20.489844 | 0.525931 | 47.32565 | false | false | 2025-02-02 | 2025-02-03 | 1 | prithivMLmods/Primal-Opus-14B-Optimus-v1 (Merge) |
prithivMLmods_QwQ-LCoT-14B-Conversational_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/QwQ-LCoT-14B-Conversational" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/QwQ-LCoT-14B-Conversational</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__QwQ-LCoT-14B-Conversational-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/QwQ-LCoT-14B-Conversational | 60ef4aa0a2660f9b6f28a3de773729969a1df9ae | 33.165443 | apache-2.0 | 9 | 14.77 | true | false | false | false | 1.954452 | 0.404743 | 40.474275 | 0.623983 | 45.62626 | 0.314199 | 31.41994 | 0.349832 | 13.310962 | 0.484719 | 20.623177 | 0.527842 | 47.538047 | false | false | 2025-01-18 | 2025-01-19 | 1 | prithivMLmods/QwQ-LCoT-14B-Conversational (Merge) |
prithivMLmods_QwQ-LCoT-3B-Instruct_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/QwQ-LCoT-3B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/QwQ-LCoT-3B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__QwQ-LCoT-3B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/QwQ-LCoT-3B-Instruct | 1f47223ac1c6069c3e53b75a45ad496f0fb9a124 | 21.012747 | creativeml-openrail-m | 10 | 3.086 | true | false | false | false | 0.765394 | 0.435442 | 43.54424 | 0.476298 | 26.621188 | 0.101964 | 10.196375 | 0.281879 | 4.250559 | 0.435792 | 12.773958 | 0.358211 | 28.69016 | false | false | 2024-12-12 | 2025-01-12 | 1 | prithivMLmods/QwQ-LCoT-3B-Instruct (Merge) |
prithivMLmods_QwQ-LCoT-7B-Instruct_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/QwQ-LCoT-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/QwQ-LCoT-7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__QwQ-LCoT-7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/QwQ-LCoT-7B-Instruct | 06f0076fcf5cb72222513e6c76bd33e1ebaa97b7 | 28.132178 | creativeml-openrail-m | 23 | 7.616 | true | false | false | false | 0.650305 | 0.49869 | 49.869014 | 0.546647 | 34.780933 | 0.207704 | 20.770393 | 0.302013 | 6.935123 | 0.480188 | 19.390104 | 0.433428 | 37.047503 | false | false | 2024-12-14 | 2025-01-07 | 1 | prithivMLmods/QwQ-LCoT-7B-Instruct (Merge) |
prithivMLmods_QwQ-LCoT1-Merged_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/QwQ-LCoT1-Merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/QwQ-LCoT1-Merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__QwQ-LCoT1-Merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/QwQ-LCoT1-Merged | d85a4f359bc568afb7b1a2a6e6503934bb352ab6 | 28.38084 | 8 | 7.616 | false | false | false | false | 0.65538 | 0.475135 | 47.513486 | 0.548096 | 35.166254 | 0.249245 | 24.924471 | 0.307047 | 7.606264 | 0.469615 | 17.76849 | 0.435755 | 37.306073 | false | false | 2025-01-21 | 2025-01-22 | 1 | prithivMLmods/QwQ-LCoT1-Merged (Merge) |
prithivMLmods_QwQ-LCoT2-7B-Instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/QwQ-LCoT2-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/QwQ-LCoT2-7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__QwQ-LCoT2-7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/QwQ-LCoT2-7B-Instruct | f2ea462f6d3f6cf104313b1329909cb15a388841 | 28.574182 | apache-2.0 | 10 | 7.616 | true | false | false | false | 1.365323 | 0.556118 | 55.611777 | 0.542486 | 34.366737 | 0.222054 | 22.205438 | 0.297819 | 6.375839 | 0.456438 | 15.754688 | 0.434176 | 37.130615 | false | false | 2025-01-20 | 2025-01-24 | 1 | prithivMLmods/QwQ-LCoT2-7B-Instruct (Merge) |
prithivMLmods_QwQ-MathOct-7B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/QwQ-MathOct-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/QwQ-MathOct-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__QwQ-MathOct-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/QwQ-MathOct-7B | d2ff038987cc16a7b317034929dd9ab35265e308 | 27.918705 | apache-2.0 | 8 | 7.616 | true | false | false | false | 0.664523 | 0.46844 | 46.84404 | 0.548551 | 35.254667 | 0.260574 | 26.057402 | 0.302852 | 7.04698 | 0.460063 | 15.307812 | 0.433012 | 37.00133 | false | false | 2025-01-11 | 2025-01-19 | 1 | prithivMLmods/QwQ-MathOct-7B (Merge) |
prithivMLmods_QwQ-R1-Distill-1.5B-CoT_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/QwQ-R1-Distill-1.5B-CoT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/QwQ-R1-Distill-1.5B-CoT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__QwQ-R1-Distill-1.5B-CoT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/QwQ-R1-Distill-1.5B-CoT | cd1a92a4fffbc923013e2a77d9d7f2c8b2a738ae | 12.458841 | apache-2.0 | 9 | 1.777 | true | false | false | false | 0.587416 | 0.219396 | 21.939565 | 0.366621 | 11.476456 | 0.246224 | 24.622356 | 0.286074 | 4.809843 | 0.343396 | 1.757812 | 0.191323 | 10.147015 | false | false | 2025-01-21 | 2025-01-22 | 1 | prithivMLmods/QwQ-R1-Distill-1.5B-CoT (Merge) |
prithivMLmods_QwQ-R1-Distill-7B-CoT_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/QwQ-R1-Distill-7B-CoT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/QwQ-R1-Distill-7B-CoT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__QwQ-R1-Distill-7B-CoT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/QwQ-R1-Distill-7B-CoT | db0c74ffe611d00eb0a5df4413f3eced7fdacb78 | 18.919333 | apache-2.0 | 9 | 7.616 | true | false | false | false | 0.67112 | 0.350038 | 35.00379 | 0.438789 | 20.953831 | 0.271903 | 27.190332 | 0.293624 | 5.816555 | 0.377906 | 4.504948 | 0.280419 | 20.046543 | false | false | 2025-01-21 | 2025-01-22 | 1 | prithivMLmods/QwQ-R1-Distill-7B-CoT (Merge) |
prithivMLmods_Qwen-7B-Distill-Reasoner_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Qwen-7B-Distill-Reasoner" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Qwen-7B-Distill-Reasoner</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Qwen-7B-Distill-Reasoner-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Qwen-7B-Distill-Reasoner | b83c5c3d748f756927b87ae978f94fdb033c526b | 18.425824 | apache-2.0 | 7 | 7.616 | true | false | false | false | 0.660404 | 0.339571 | 33.957123 | 0.440933 | 22.175998 | 0.21148 | 21.148036 | 0.327181 | 10.290828 | 0.365969 | 2.779427 | 0.281832 | 20.203531 | false | false | 2025-01-28 | 2025-01-28 | 1 | prithivMLmods/Qwen-7B-Distill-Reasoner (Merge) |
prithivMLmods_Qwen2.5-1.5B-DeepSeek-R1-Instruct_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Qwen2.5-1.5B-DeepSeek-R1-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Qwen2.5-1.5B-DeepSeek-R1-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Qwen2.5-1.5B-DeepSeek-R1-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Qwen2.5-1.5B-DeepSeek-R1-Instruct | ca8cf376e59e873d70f8b9dffcb19aecc9d32fab | 4.06773 | 7 | 1.777 | false | false | false | false | 0.610808 | 0.139686 | 13.968603 | 0.282437 | 1.361067 | 0 | 0 | 0.276007 | 3.467562 | 0.372354 | 4.244271 | 0.112284 | 1.364879 | false | false | 2025-01-29 | 2025-01-29 | 1 | prithivMLmods/Qwen2.5-1.5B-DeepSeek-R1-Instruct (Merge) |
prithivMLmods_Qwen2.5-14B-DeepSeek-R1-1M_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Qwen2.5-14B-DeepSeek-R1-1M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Qwen2.5-14B-DeepSeek-R1-1M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Qwen2.5-14B-DeepSeek-R1-1M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Qwen2.5-14B-DeepSeek-R1-1M | bc7898d09ac620cf86afa3237daa8181f689345b | 31.035779 | 8 | 14.77 | false | false | false | true | 4.190395 | 0.419281 | 41.928084 | 0.593485 | 40.75991 | 0.314955 | 31.495468 | 0.332215 | 10.961969 | 0.460604 | 17.742188 | 0.489943 | 43.327054 | false | false | 2025-01-29 | 2025-02-01 | 1 | prithivMLmods/Qwen2.5-14B-DeepSeek-R1-1M (Merge) |
prithivMLmods_Qwen2.5-7B-DeepSeek-R1-1M_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Qwen2.5-7B-DeepSeek-R1-1M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Qwen2.5-7B-DeepSeek-R1-1M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Qwen2.5-7B-DeepSeek-R1-1M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Qwen2.5-7B-DeepSeek-R1-1M | a42acdfe01bd887dde308deeb07d570979976838 | 5.131314 | 8 | 7.616 | false | false | false | false | 0.661391 | 0.186123 | 18.612282 | 0.312555 | 4.665735 | 0 | 0 | 0.261745 | 1.565996 | 0.341688 | 3.710937 | 0.120096 | 2.232934 | false | false | 2025-01-29 | 2025-01-29 | 1 | prithivMLmods/Qwen2.5-7B-DeepSeek-R1-1M (Merge) |
prithivMLmods_SmolLM2-CoT-360M_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/SmolLM2-CoT-360M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/SmolLM2-CoT-360M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__SmolLM2-CoT-360M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/SmolLM2-CoT-360M | 474240d772fbb3b8da6f8eb47f32dd34c6b78baf | 5.724162 | apache-2.0 | 14 | 0.362 | true | false | false | false | 0.387752 | 0.221569 | 22.156877 | 0.31353 | 4.801205 | 0.006798 | 0.679758 | 0.236577 | 0 | 0.379396 | 5.757813 | 0.108544 | 0.94932 | false | false | 2025-01-05 | 2025-01-07 | 1 | prithivMLmods/SmolLM2-CoT-360M (Merge) |
prithivMLmods_Taurus-Opus-7B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Taurus-Opus-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Taurus-Opus-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Taurus-Opus-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Taurus-Opus-7B | 4b9918fb7ed2a92bdb1beae11deb337a3745d053 | 26.064884 | apache-2.0 | 9 | 7.456 | true | false | false | false | 0.68181 | 0.422328 | 42.232831 | 0.536736 | 34.234016 | 0.227341 | 22.734139 | 0.326342 | 10.178971 | 0.439885 | 14.21901 | 0.395113 | 32.790337 | false | false | 2025-01-25 | 2025-01-27 | 1 | prithivMLmods/Taurus-Opus-7B (Merge) |
prithivMLmods_Triangulum-10B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Triangulum-10B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Triangulum-10B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Triangulum-10B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Triangulum-10B | d3776fbe6bfc884f1380fe128223759d76214049 | 28.099256 | llama3.1 | 10 | 10.306 | true | false | false | false | 0.859414 | 0.322935 | 32.293537 | 0.596802 | 42.240747 | 0.3429 | 34.29003 | 0.354027 | 13.870246 | 0.41725 | 10.589583 | 0.417803 | 35.311392 | false | false | 2024-12-30 | 2025-01-07 | 1 | prithivMLmods/Triangulum-10B (Merge) |
prithivMLmods_Triangulum-5B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Triangulum-5B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Triangulum-5B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Triangulum-5B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Triangulum-5B | 55e161fc171b17b3e6c15aef9d5318a51bdb48fb | 3.860735 | creativeml-openrail-m | 8 | 5.413 | true | false | false | false | 0.492429 | 0.128321 | 12.832063 | 0.312412 | 4.293502 | 0.001511 | 0.151057 | 0.255034 | 0.671141 | 0.344542 | 2.734375 | 0.12234 | 2.48227 | false | false | 2024-12-31 | 2025-01-07 | 1 | prithivMLmods/Triangulum-5B (Merge) |
prithivMLmods_Triangulum-v2-10B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Triangulum-v2-10B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Triangulum-v2-10B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Triangulum-v2-10B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Triangulum-v2-10B | a23407caf6f232d305c5cdf1c802dfa430e57915 | 32.758311 | llama3.1 | 7 | 10.306 | true | false | false | false | 0.907631 | 0.670523 | 67.05231 | 0.606453 | 42.754726 | 0.240181 | 24.018127 | 0.337248 | 11.63311 | 0.428073 | 12.575781 | 0.446642 | 38.51581 | false | false | 2025-01-30 | 2025-01-31 | 0 | prithivMLmods/Triangulum-v2-10B |
prithivMLmods_Tulu-MathLingo-8B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/prithivMLmods/Tulu-MathLingo-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Tulu-MathLingo-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Tulu-MathLingo-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | prithivMLmods/Tulu-MathLingo-8B | 0fb551a24dfe1a576e2c5118a7581588d339a2e7 | 21.596382 | creativeml-openrail-m | 9 | 8.03 | true | false | false | false | 0.841574 | 0.55894 | 55.894028 | 0.465881 | 24.703351 | 0.132931 | 13.293051 | 0.290268 | 5.369128 | 0.386427 | 7.603385 | 0.304438 | 22.715352 | false | false | 2024-12-23 | 2025-01-12 | 1 | prithivMLmods/Tulu-MathLingo-8B (Merge) |
pszemraj_Llama-3-6.3b-v0.1_bfloat16 | bfloat16 | 🟩 continuously pretrained | 🟩 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/pszemraj/Llama-3-6.3b-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pszemraj/Llama-3-6.3b-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pszemraj__Llama-3-6.3b-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | pszemraj/Llama-3-6.3b-v0.1 | 7000b39346162f95f19aa4ca3975242db61902d7 | 10.333954 | llama3 | 6 | 6.3 | true | false | false | false | 0.814463 | 0.10439 | 10.438969 | 0.419681 | 18.679996 | 0.018127 | 1.812689 | 0.283557 | 4.474273 | 0.390833 | 6.154167 | 0.283993 | 20.443632 | false | false | 2024-05-17 | 2024-06-26 | 1 | meta-llama/Meta-Llama-3-8B |
pszemraj_Mistral-v0.3-6B_bfloat16 | bfloat16 | 🟩 continuously pretrained | 🟩 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/pszemraj/Mistral-v0.3-6B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pszemraj/Mistral-v0.3-6B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pszemraj__Mistral-v0.3-6B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | pszemraj/Mistral-v0.3-6B | ae11a699012b83996361f04808f4d45debf3b01c | 10.046851 | apache-2.0 | 1 | 5.939 | true | false | false | false | 0.530539 | 0.245374 | 24.53745 | 0.377405 | 13.515091 | 0.009063 | 0.906344 | 0.265101 | 2.013423 | 0.390771 | 6.613021 | 0.214262 | 12.695774 | false | false | 2024-05-25 | 2024-06-26 | 2 | pszemraj/Mistral-7B-v0.3-prune6 (Merge) |
qingy2019_LLaMa_3.2_3B_Catalysts_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/qingy2019/LLaMa_3.2_3B_Catalysts" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2019/LLaMa_3.2_3B_Catalysts</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2019__LLaMa_3.2_3B_Catalysts-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | qingy2019/LLaMa_3.2_3B_Catalysts | 3f4a318114beb37f32a2c143cbd68b6d15d18164 | 19.628816 | apache-2.0 | 1 | 3 | true | false | false | false | 0.649834 | 0.49924 | 49.923979 | 0.446813 | 21.345401 | 0.111027 | 11.102719 | 0.288591 | 5.145414 | 0.378771 | 7.946354 | 0.300781 | 22.309028 | false | false | 2024-10-19 | 2024-10-29 | 2 | meta-llama/Llama-3.2-3B-Instruct |
qingy2019_OpenMath2-Llama3.1-8B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/qingy2019/OpenMath2-Llama3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2019/OpenMath2-Llama3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2019__OpenMath2-Llama3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | qingy2019/OpenMath2-Llama3.1-8B | 38412f988f7688d884c9249b2a4e5cc76f98c1c6 | 8.987818 | 0 | 8 | false | false | false | false | 0.692806 | 0.233059 | 23.305939 | 0.409552 | 16.29437 | 0.041541 | 4.154079 | 0.265101 | 2.013423 | 0.343552 | 2.010677 | 0.155336 | 6.148419 | false | false | 2024-11-23 | 0 | Removed |
qingy2019_Oracle-14B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/qingy2019/Oracle-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2019/Oracle-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2019__Oracle-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | qingy2019/Oracle-14B | 0154031aa9306aa98da156a0f3c8e10d9f1377f6 | 13.34025 | 0 | 13.668 | false | false | false | false | 1.393024 | 0.235832 | 23.583204 | 0.461158 | 23.18463 | 0.064199 | 6.41994 | 0.25755 | 1.006711 | 0.371667 | 10.491667 | 0.238198 | 15.355349 | false | false | 2024-11-23 | 0 | Removed |
qingy2019_Oracle-14B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/qingy2019/Oracle-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2019/Oracle-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2019__Oracle-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | qingy2019/Oracle-14B | 0154031aa9306aa98da156a0f3c8e10d9f1377f6 | 13.479724 | 0 | 13.668 | false | false | false | false | 1.368887 | 0.240079 | 24.007855 | 0.46223 | 23.301946 | 0.06571 | 6.570997 | 0.260906 | 1.454139 | 0.370333 | 10.225 | 0.237866 | 15.31841 | false | false | 2024-11-24 | 0 | Removed |
qingy2019_Qwen2.5-Math-14B-Instruct_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/qingy2019/Qwen2.5-Math-14B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2019/Qwen2.5-Math-14B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2019__Qwen2.5-Math-14B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | qingy2019/Qwen2.5-Math-14B-Instruct | 025d9637208b862c7b10b7590969fe6870ce01a0 | 36.70629 | apache-2.0 | 1 | 14 | true | false | false | false | 1.932827 | 0.606626 | 60.662597 | 0.635007 | 47.017086 | 0.284743 | 28.47432 | 0.372483 | 16.331096 | 0.475729 | 19.632812 | 0.533078 | 48.119829 | false | false | 2024-12-01 | 2024-12-01 | 3 | Qwen/Qwen2.5-14B |
qingy2019_Qwen2.5-Math-14B-Instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/qingy2019/Qwen2.5-Math-14B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2019/Qwen2.5-Math-14B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2019__Qwen2.5-Math-14B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | qingy2019/Qwen2.5-Math-14B-Instruct | 025d9637208b862c7b10b7590969fe6870ce01a0 | 36.380504 | apache-2.0 | 1 | 14 | true | false | false | false | 1.971893 | 0.600531 | 60.053104 | 0.635649 | 47.065572 | 0.276435 | 27.643505 | 0.369128 | 15.883669 | 0.475667 | 19.425 | 0.53391 | 48.212175 | false | false | 2024-12-01 | 2024-12-01 | 3 | Qwen/Qwen2.5-14B |
qingy2019_Qwen2.5-Math-14B-Instruct-Alpha_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/qingy2019/Qwen2.5-Math-14B-Instruct-Alpha" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2019/Qwen2.5-Math-14B-Instruct-Alpha</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2019__Qwen2.5-Math-14B-Instruct-Alpha-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | qingy2019/Qwen2.5-Math-14B-Instruct-Alpha | e24aaa0779b576301bfb62b93789dea24ab10c88 | 35.456012 | apache-2.0 | 2 | 14 | true | false | false | false | 1.893142 | 0.598083 | 59.808309 | 0.637508 | 47.750108 | 0.231118 | 23.111782 | 0.369966 | 15.995526 | 0.464938 | 17.950521 | 0.533078 | 48.119829 | false | false | 2024-12-03 | 2024-12-03 | 2 | Qwen/Qwen2.5-14B |
qingy2019_Qwen2.5-Math-14B-Instruct-Pro_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/qingy2019/Qwen2.5-Math-14B-Instruct-Pro" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2019/Qwen2.5-Math-14B-Instruct-Pro</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2019__Qwen2.5-Math-14B-Instruct-Pro-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | qingy2019/Qwen2.5-Math-14B-Instruct-Pro | 295a9ce370c2bfeabe13f76d52c92f57ff6d0308 | 19.70776 | 0 | 14.766 | false | false | false | true | 1.659569 | 0.192168 | 19.216789 | 0.531869 | 33.036904 | 0.251511 | 25.151057 | 0.311242 | 8.165548 | 0.374031 | 4.253906 | 0.355801 | 28.422355 | false | false | 2024-12-03 | 2024-12-03 | 1 | qingy2019/Qwen2.5-Math-14B-Instruct-Pro (Merge) |