Text Generation
GGUF
English
Chinese
MOE
Qwen 2.5 MOE
Mixture of Experts
Uncensored
2X7B
deepseek
reasoning
thinking
creative
128k context
general usage
problem solving
brainstorming
solve riddles
story generation
plot generation
storytelling
fiction story
story
writing
fiction
Qwen 2.5
mergekit
Inference Endpoints
conversational
Update README.md
README.md CHANGED
@@ -30,7 +30,7 @@ tags:
 pipeline_tag: text-generation
 ---
 
-(quants uploading
+(quants uploading)
 
 <H2>Qwen2.5-MOE-2X7B-DeepSeek-Abliterated-Censored-19B-gguf</H2>
 
@@ -43,7 +43,7 @@ The model is just over 19B because of the unqiue "shared expert" (roughly 2.5 mo
 
 The oddball configuration yields interesting "thinking/reasoning" which is stronger than either 7B model on its own.
 
-
+Five example generations at the bottom of this page.
 
 This model can be used for all use cases, and is also (mostly) uncensored.
 
@@ -444,5 +444,3 @@ This is just the first person perspective of a possible 1000-word piece—"The S
 
 Let me know if you'd like any more details—"the sky is swaying"—you can see how far it leans, how far it sways.
 
-
-
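For readers landing on this commit: once the "(quants uploading)" note above resolves, the GGUF files can be run locally. Below is a minimal sketch using llama-cpp-python, assuming a downloaded quant file; the filename and sampling settings are illustrative assumptions, not part of this repo, so substitute whichever quant you actually fetch.

```python
# Minimal sketch: chat with one of the GGUF quants via llama-cpp-python.
# ASSUMPTION: the model_path filename is hypothetical (the quants were still
# uploading at the time of this commit); point it at the file you downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="Qwen2.5-MOE-2X7B-DeepSeek-Abliterated-Censored-19B-Q4_K_M.gguf",
    n_ctx=8192,       # the card advertises 128k context; raise this if RAM allows
    n_gpu_layers=-1,  # offload all layers to GPU when one is available
)

out = llm.create_chat_completion(
    messages=[
        {"role": "user", "content": "Solve this riddle: what has keys but can't open locks?"}
    ],
    max_tokens=256,
    temperature=0.8,
)
print(out["choices"][0]["message"]["content"])
```

The chat-completion call is used here because the card tags the model as conversational; for plain story or plot generation, calling `llm("Once upon a time", max_tokens=512)` works the same way with a raw prompt.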