Khetterman committed · Commit 16ad524 · verified · 1 Parent(s): d9e5d8d

Create README.md
---
base_model:
- Khetterman/CursedMatrix-8B-v9
- aloobun/CosmicBun-8B-DPO
- Arkana08/LexiMaid-L3-8B
- Arkana08/Mythorica-L3-8B
- bluuwhale/L3-SthenoMaidBlackroot-8B-V1
- Casual-Autopsy/L3-Luna-8B
- IlyaGusev/saiga_llama3_8b
- invisietch/L3.1-EtherealRainbow-v1.0-rc1-8B
- jeiku/Average_Normie_v3.69_8B
- SicariusSicariiStuff/LLAMA-3_8B_Unaligned_BETA
- v000000/L3-8B-BlueSerpentine
- ZeroXClem/L3SAO-Mix-SuperHermes-NovaPurosani-8B
- ZeroXClem/Llama-3-Aetheric-Hermes-Lexi-Smaug-8B
- ZeroXClem/Llama3.1-TheiaFire-DarkFusion-8B
library_name: transformers
tags:
- mergekit
- merge
- bfloat16
- safetensors
- 8b
- chat
- creative
- roleplay
- conversational
- not-for-all-audiences
language:
- en
- ru
---
# Kosmos-8B-v1

>The serenity of infinity is not the end.

![KosmosLogo256.png](https://cdn-uploads.huggingface.co/production/uploads/673125091920e70ac26c8a2e/9Whmlrrnv49oULY1-kuMW.png)

This is an interesting merge of **14 cool models**, created using [mergekit](https://github.com/arcee-ai/mergekit).
Enjoy exploring :)

## Merge Details
### Method

This model was merged in several steps: groups of source models were first combined with `model_stock`, and the intermediate results were then remerged (trying a few model variations along the way) for the best result.

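The core idea of each step (take a base model plus several fine-tunes and blend their weights) can be sketched in plain Python. This is a deliberately simplified uniform average for illustration only, not mergekit's actual `model_stock` algorithm, which derives its interpolation ratio from the geometry of the task vectors; all names below are made up.

```python
# Toy illustration of a one-step weight merge: blend the uniform mean
# of several fine-tuned checkpoints with a shared base model.
# NOTE: a simplification -- mergekit's model_stock chooses the blend
# ratio from the angles between task vectors, not a fixed alpha.

def average_merge(base, finetunes, alpha=0.5):
    """Move each base parameter toward the mean of the fine-tunes.

    base:      dict of parameter name -> list of floats
    finetunes: list of such dicts (same keys and shapes)
    alpha:     how far to move from the base toward the mean
    """
    merged = {}
    for name, base_w in base.items():
        mean_w = [
            sum(ft[name][i] for ft in finetunes) / len(finetunes)
            for i in range(len(base_w))
        ]
        merged[name] = [
            (1 - alpha) * b + alpha * m for b, m in zip(base_w, mean_w)
        ]
    return merged

base = {"layer.weight": [0.0, 0.0]}
fts = [{"layer.weight": [1.0, 2.0]}, {"layer.weight": [3.0, 2.0]}]
print(average_merge(base, fts))  # halfway between the base and the mean [2.0, 2.0]
```

Each intermediate merge above feeds into the next step as either a model or the new base, which is why the later configs reference local `F:/` outputs of the earlier ones.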
### Models

The following models were included in the merge:

* [Khetterman/CursedMatrix-8B-v9](https://huggingface.co/Khetterman/CursedMatrix-8B-v9)
* [aloobun/CosmicBun-8B-DPO](https://huggingface.co/aloobun/CosmicBun-8B-DPO)
* [Arkana08/LexiMaid-L3-8B](https://huggingface.co/Arkana08/LexiMaid-L3-8B)
* [Arkana08/Mythorica-L3-8B](https://huggingface.co/Arkana08/Mythorica-L3-8B)
* [bluuwhale/L3-SthenoMaidBlackroot-8B-V1](https://huggingface.co/bluuwhale/L3-SthenoMaidBlackroot-8B-V1)
* [Casual-Autopsy/L3-Luna-8B](https://huggingface.co/Casual-Autopsy/L3-Luna-8B)
* [IlyaGusev/saiga_llama3_8b](https://huggingface.co/IlyaGusev/saiga_llama3_8b)
* [invisietch/L3.1-EtherealRainbow-v1.0-rc1-8B](https://huggingface.co/invisietch/L3.1-EtherealRainbow-v1.0-rc1-8B)
* [jeiku/Average_Normie_v3.69_8B](https://huggingface.co/jeiku/Average_Normie_v3.69_8B)
* [SicariusSicariiStuff/LLAMA-3_8B_Unaligned_BETA](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned_BETA)
* [v000000/L3-8B-BlueSerpentine](https://huggingface.co/v000000/L3-8B-BlueSerpentine)
* [ZeroXClem/L3SAO-Mix-SuperHermes-NovaPurosani-8B](https://huggingface.co/ZeroXClem/L3SAO-Mix-SuperHermes-NovaPurosani-8B)
* [ZeroXClem/Llama-3-Aetheric-Hermes-Lexi-Smaug-8B](https://huggingface.co/ZeroXClem/Llama-3-Aetheric-Hermes-Lexi-Smaug-8B)
* [ZeroXClem/Llama3.1-TheiaFire-DarkFusion-8B](https://huggingface.co/ZeroXClem/Llama3.1-TheiaFire-DarkFusion-8B)

### Configuration

The following YAML configurations were used to produce this model:

```yaml
# Cursed-UnalignedCosmicSaiga-8B-v1
models:
  - model: SicariusSicariiStuff/LLAMA-3_8B_Unaligned_BETA
  - model: aloobun/CosmicBun-8B-DPO
  - model: IlyaGusev/saiga_llama3_8b
merge_method: model_stock
base_model: Khetterman/CursedMatrix-8B-v9
dtype: bfloat16

# Cursed-BlueRainbowMaid-8B-v1
models:
  - model: v000000/L3-8B-BlueSerpentine
  - model: invisietch/L3.1-EtherealRainbow-v1.0-rc1-8B
  - model: bluuwhale/L3-SthenoMaidBlackroot-8B-V1
merge_method: model_stock
base_model: Khetterman/CursedMatrix-8B-v9
dtype: bfloat16

# Cursed-AverageLunaFusion-8B-v1
models:
  - model: jeiku/Average_Normie_v3.69_8B
  - model: Casual-Autopsy/L3-Luna-8B
  - model: ZeroXClem/Llama3.1-TheiaFire-DarkFusion-8B
merge_method: model_stock
base_model: Khetterman/CursedMatrix-8B-v9
dtype: bfloat16

# InfectedKosmos-8B-v1
models:
  - model: F:/Cursed-UnalignedCosmicSaiga-8B-v1
  - model: F:/Cursed-BlueRainbowMaid-8B-v1
  - model: F:/Cursed-AverageLunaFusion-8B-v1
merge_method: model_stock
base_model: Khetterman/CursedMatrix-8B-v9
dtype: bfloat16

# ZeroArkana-A
models:
  - model: ZeroXClem/L3SAO-Mix-SuperHermes-NovaPurosani-8B
    parameters:
      weight: [0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50]
      density: [0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50]
  - model: Arkana08/LexiMaid-L3-8B
    parameters:
      weight: [0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50]
      density: [0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50]
merge_method: della
parameters:
  epsilon: 0.1
  lambda: 1.0
base_model: F:/InfectedKosmos-8B-v1
dtype: bfloat16

# ZeroArkana-B
models:
  - model: ZeroXClem/Llama-3-Aetheric-Hermes-Lexi-Smaug-8B
    parameters:
      weight: [0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50]
      density: [0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50]
  - model: Arkana08/Mythorica-L3-8B
    parameters:
      weight: [0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50]
      density: [0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50]
merge_method: della
parameters:
  epsilon: 0.1
  lambda: 1.0
base_model: F:/InfectedKosmos-8B-v1
dtype: bfloat16

# Kosmos-8B-v1
models:
  - model: F:/ZeroArkana-A
  - model: F:/ZeroArkana-B
merge_method: model_stock
base_model: F:/InfectedKosmos-8B-v1
dtype: bfloat16
```
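The bracketed `weight`/`density` lists in the della steps are per-layer gradients: mergekit interpolates the anchor values across the model's layers, and the two source models' lists are mirror images, so where one model's weight peaks the other's takes over. A small sketch of that expansion, assuming linear interpolation over Llama-3-8B's 32 hidden layers (the layer count and interpolation scheme here are illustrative, not read from mergekit's source):

```python
# Expand a bracketed "gradient" list of anchor values into one value
# per layer by linear interpolation, the way mergekit applies the
# weight/density lists above. 32 = Llama-3-8B's hidden layer count.

def interpolate_gradient(anchors, num_layers=32):
    """Linearly interpolate len(anchors) anchor points over num_layers layers."""
    out = []
    for layer in range(num_layers):
        # Position of this layer on the 0 .. len(anchors)-1 anchor axis.
        pos = layer * (len(anchors) - 1) / (num_layers - 1)
        lo = int(pos)
        hi = min(lo + 1, len(anchors) - 1)
        frac = pos - lo
        out.append(anchors[lo] * (1 - frac) + anchors[hi] * frac)
    return out

weight = [0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20,
          0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35,
          0.25, 0.20, 0.25, 0.35, 0.50]
density = list(reversed(weight))  # the paired density list is the exact mirror

per_layer_w = interpolate_gradient(weight)
per_layer_d = interpolate_gradient(density)
# Both curves start and end at 0.50 and oscillate in opposite phase
# through the middle layers.
```

Note that `density` here really is `reversed(weight)`; the config pairs a model's high-weight layers with low density (and vice versa), which keeps the della merge from letting either model dominate the same layers on both axes.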

>My thanks to the authors of the original models; your work is incredible. Have a good time 🖤