---
base_model:
- CoolSpring/Qwen2-0.5B-Abyme-merge3
- >-
  FlofloB/100k_fineweb_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit
library_name: transformers
tags:
- mergekit
- merge
- rp
- roleplay
language:
- es
- en
datasets:
- HuggingFaceFW/fineweb
---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

<center><b>The Brothers</b>

[Abyss](https://huggingface.co/Novaciano/Qwen2.5-0.5B-Abyss) & [Cliff](https://huggingface.co/Novaciano/Qwen2.5-0.5B-Cliff)

<img src="https://i.ibb.co/prXNXXN7/IMG-20250311-031904.jpg" alt="The Brothers: Abyss & Cliff">

Both are combinations of the best Qwen2.5-0.5B models from the Open LLM Leaderboard.

</center>

### Merge Method

This model was merged using the [Arcee Fusion](https://arcee.ai) merge method, with [CoolSpring/Qwen2-0.5B-Abyme-merge3](https://huggingface.co/CoolSpring/Qwen2-0.5B-Abyme-merge3) as the base model.
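
To reproduce the merge, the YAML listed under Configuration below can be passed to mergekit. A minimal sketch using mergekit's Python entry point; the file name `config.yml` and the output path `./merged` are placeholders:

```python
import torch
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Parse the merge configuration (the YAML from the "Configuration" section,
# saved locally as config.yml -- a placeholder file name).
with open("config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Run the merge and write the result to ./merged (placeholder output path).
run_merge(
    merge_config,
    out_path="./merged",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # use a GPU if one is available
        copy_tokenizer=True,             # write the union tokenizer to the output
    ),
)
```

The same merge can also be run from the command line with mergekit's `mergekit-yaml` entry point (`mergekit-yaml config.yml ./merged`).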

### Models Merged

The following models were included in the merge:
* [FlofloB/100k_fineweb_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit](https://huggingface.co/FlofloB/100k_fineweb_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: FlofloB/100k_fineweb_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit
  - model: CoolSpring/Qwen2-0.5B-Abyme-merge3
    parameters:
      density: 0.53
      weight: 0.6
merge_method: arcee_fusion
base_model: CoolSpring/Qwen2-0.5B-Abyme-merge3
tokenizer_source: union
parameters:
  int8_mask: true
dtype: bfloat16
random_seed: 0
```
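
For inference, the merged checkpoint loads like any other Qwen2.5-style causal LM through `transformers`. A minimal sketch, assuming `./merged` is the local output directory from the merge step above; substitute this repository's Hub id to load it remotely:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "./merged"  # placeholder: local merge output or this repo's Hub id

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(model_path, torch_dtype=torch.bfloat16)

# The card lists Spanish and English; either works as a prompt language.
messages = [{"role": "user", "content": "Hola, ¿quién eres?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```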