---
base_model:
- Etherll/Qwen2.5-7B-della-test
- marcuscedricridia/olmner-7b
- ehristoforu/fq2.5-7b-it-normalize_false
library_name: transformers
tags:
- mergekit
- merge
---

# OLMNER-DELLA-7B Benchmark Results

OLMNER-DELLA-7B takes OLMNER a step further by merging it with a **DELLA-optimized model** (a TIES merge; see the configuration below), achieving even stronger benchmark results.

## Performance Scores

| Metric | Score |
|-------------------|------------|
| **Average Score** | **36.35%** |
| **IFEval** | **76.37%** |
| **BBH** | **35.90%** |
| **MATH** | **49.62%** |
| **GPQA** | **6.82%** |
| **MUSR** | **11.80%** |
| **MMLU-PRO** | **37.62%** |

**Carbon Emission Estimate:** **0.65 kg CO₂**

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: ehristoforu/fq2.5-7b-it-normalize_false
    # no parameters necessary for base model
  - model: Etherll/Qwen2.5-7B-della-test
    parameters:
      density: 0.5
      weight: 0.5
  - model: marcuscedricridia/olmner-7b
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: ehristoforu/fq2.5-7b-it-normalize_false
parameters:
  normalize: false
  int8_mask: true
dtype: bfloat16
```
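
To reproduce the merge, a configuration like this is normally applied with [mergekit](https://github.com/arcee-ai/mergekit). The snippet below is a minimal sketch using mergekit's Python entry points; the module paths, the `MergeOptions` fields, and the `./olmner-della-7b` output path are assumptions based on recent mergekit releases, and the `mergekit-yaml config.yaml ./output-dir` CLI is the more common route.

```python
# Minimal sketch of running the merge from Python (assumed mergekit API;
# the mergekit-yaml CLI is the usual way to apply a config like this).
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the YAML config shown above (saved locally as config.yaml).
with open("config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Write the merged model to a local directory (hypothetical path).
run_merge(
    merge_config,
    out_path="./olmner-della-7b",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # use a GPU if one is available
        copy_tokenizer=True,             # copy the base model's tokenizer
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)
```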
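
For completeness, here is a hedged sketch of loading the merged model for inference with Transformers. The repository id is a placeholder (replace it with this model's actual Hub id or a local path), and the chat-template call assumes the Qwen2.5 tokenizer's template is carried over from the base model.

```python
# Minimal usage sketch: load the merged checkpoint and generate a reply.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-namespace/OLMNER-DELLA-7B"  # placeholder repo id or local path

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used for the merge
    device_map="auto",
)

# Qwen2.5-based merges are chat models, so apply the chat template.
messages = [{"role": "user", "content": "Briefly explain what a TIES merge does."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```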