Alsebay/NaruMOE-3x7B-v2 AWQ

Model Summary

A Mixture-of-Experts (MoE) model for roleplaying. Since 7B models are small, several of them can be combined into a larger model, which can be smarter than any single expert.

It can handle some limited TSF (Trans Sexual Fiction) content, because my pre-trained model is included in the merge.

Compared with V1, it is worse at logic but better at expression.
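
Usage

A minimal sketch for loading the AWQ-quantized checkpoint with Hugging Face transformers. It assumes the `autoawq` package is installed and that the quantized weights live in the `solidrust/NaruMOE-3x7B-v2-AWQ` repository; adjust the repo id and generation settings to taste.

```python
# Minimal sketch: load the AWQ-quantized MoE checkpoint and run a short
# roleplay-style generation. Assumes `pip install transformers autoawq`
# and that "solidrust/NaruMOE-3x7B-v2-AWQ" is the correct repo id.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "solidrust/NaruMOE-3x7B-v2-AWQ"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",  # place the quantized weights on available GPUs
)

# Simple roleplay-style prompt to sanity-check generation.
prompt = "You are a friendly tavern keeper. Greet the traveler who just walked in."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    temperature=0.8,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```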
