# SOLAR-10.7B

## Model Details
- Base Model: yanolja/KoSOLAR-10.7B-v0.1
## Datasets
- Sampled and translated subset of Open-Orca/SlimOrca
- Sampled and translated subset of Anthropic/hh-rlhf
## Benchmark
- SOTA model as of Jan 1, 2024 on the [Open Ko-LLM Leaderboard](https://huggingface.co/spaces/upstage/open-ko-llm-leaderboard).
| Model | Average | Ko-ARC | Ko-HellaSwag | Ko-MMLU | Ko-TruthfulQA | Ko-CommonGen V2 |
|---|---|---|---|---|---|---|
| hyeogi/SOLAR-10.7B-dpo-v0.1 (Ours) | 56.29 | 47.95 | 59.49 | 51.29 | 60.97 | 61.75 |
| jeonsworld/CarbonVillain-10.7B-v1 | 55.33 | 49.91 | 60.65 | 55.04 | 48.22 | 62.81 |
| Megastudy/M-SOLAR-10.7B-v1.1-beta | 55.25 | 51.71 | 60.86 | 54.24 | 47.12 | 62.34 |
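The model can be loaded with the Hugging Face `transformers` library. The sketch below is a minimal, non-authoritative example; the single-turn prompt template in `build_prompt` is an assumption and should be checked against the tokenizer's bundled chat template before use.

```python
MODEL_ID = "hyeogi/SOLAR-10.7B-dpo-v0.1"


def build_prompt(instruction: str) -> str:
    """Single-turn prompt. NOTE: this template is an assumption --
    verify it against the tokenizer's own chat template."""
    return f"### User:\n{instruction}\n\n### Assistant:\n"


if __name__ == "__main__":
    # Heavy imports and the multi-GB weight download happen only when
    # this script is run directly.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    inputs = tokenizer(build_prompt("Hello"), return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=128)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Running the script requires a GPU with enough memory for a 10.7B-parameter model (or quantized loading via `load_in_4bit`-style options).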