Update README.md
README.md CHANGED
@@ -21,8 +21,10 @@ NOTE: Links to GGUFs below.

<B>Evolved "Dark Planet" Model X 8</B>

-This model is based on the original "Llama 3 Dark Planet 8B" (<a href="https://huggingface.co/DavidAU/L3-Dark-Planet-8B-GGUF">GGUF</a> /
-
+This model is based on the original "Llama 3 Dark Planet 8B" (<a href="https://huggingface.co/DavidAU/L3-Dark-Planet-8B-GGUF">GGUF</a> /
+<a href="https://huggingface.co/DavidAU/L3-Dark-Planet-8B">SOURCE</a>) merge that has been "evolved" several times. Each "evolved"
+version is then tested; if it is unique and/or removes certain negative attributes and/or enhances certain positive attributes,
+it is kept, otherwise it is deleted.

This model contains the eight best models from this process, with the very best as the "captain" of the "MOE", so to speak.

@@ -36,6 +38,12 @@ When all eight of these are activated...

I have included 3 example generations at the bottom of this page with varying levels of "experts" used.

+In addition, the model:
+
+[ https://huggingface.co/aifeifei798/DarkIdol-Llama-3.1-8B-Instruct-1.2-Uncensored ]
+
+was used in a merge with the "Captain" of the MOE to both decensor the model further and give it 128k context.
+
SIDE NOTE:

Usually a "MOE" is constructed from different models, to give the "moe model" some of the best of each (or not) during generation.
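
For illustration of the "MOE" construction and the "Captain" merge described above, here is a minimal sketch of how such a model could be assembled with mergekit's mergekit-moe tool; the expert repo names, prompts, gating mode, and output path below are placeholders for illustration, not the actual models or settings used for this release.

```python
# Hypothetical sketch only: building a "captain" + 8-expert MOE with mergekit-moe.
# All repo names and prompts below are placeholders, not the real Dark Planet experts.
import subprocess
import textwrap

config = textwrap.dedent("""\
    base_model: DavidAU/placeholder-dark-planet-captain-8B   # the "captain" / base model
    gate_mode: hidden        # route tokens by hidden-state similarity to the prompts below
    dtype: bfloat16
    experts:
      - source_model: DavidAU/placeholder-dark-planet-evolved-1-8B
        positive_prompts: ["vivid horror scene", "dark, detailed prose"]
      - source_model: DavidAU/placeholder-dark-planet-evolved-2-8B
        positive_prompts: ["emotional dialogue", "dramatic scene"]
      # ...six more evolved variants would be listed the same way...
""")

with open("moe-config.yml", "w") as f:
    f.write(config)

# mergekit-moe assembles a sparse MOE checkpoint from the dense 8B experts.
subprocess.run(["mergekit-moe", "moe-config.yml", "./dark-planet-moe"], check=True)
```

In a mergekit-style MOE of this kind, the base model typically supplies the shared attention stack and router while each expert contributes its MLP weights, which is why the number of active "experts" can then be varied at load time in most front-ends.
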
@@ -130,7 +138,7 @@ IMATRIX GGUFS:

<B>Warning: Romance, Drama, Horror, Cursing, Gore, Intense - NOT for all audiences.</B>

-EXAMPLE GENERATIONS for "L3-MOE-8X8B-Dark-Planet-8D-Mirrored-Chaos-47B":
+EXAMPLE GENERATIONS for "L3.1-MOE-8X8B-Dark-Planet-8D-Mirrored-Chaos-Uncensored-47B":

Q2K GGUF - (lowest quality quant) - this will be your lowest quality output.

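The example generations referenced here were made with a Q2K quant; as a minimal sketch (not the exact setup used for those examples), such a GGUF can be run with llama-cpp-python roughly as follows. The filename, context size, prompt, and sampling settings are placeholders.

```python
# Hypothetical sketch only: loading a Q2_K quant of this model with llama-cpp-python.
# The filename below is a placeholder; use the actual GGUF file you downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="L3.1-MOE-8X8B-Dark-Planet-8D-Mirrored-Chaos-Uncensored-47B.Q2_K.gguf",
    n_ctx=8192,        # the card says the merged model supports up to 128k context; 8k keeps memory modest
    n_gpu_layers=-1,   # offload all layers to GPU if VRAM allows
)

out = llm(
    "Write the opening paragraph of a dark science fiction story.",
    max_tokens=300,
    temperature=0.8,
)
print(out["choices"][0]["text"])
```

Higher-quality quants will give better output than the Q2K shown here; the card's examples also vary the number of active "experts".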