6.5 bpw EXL2 quant of Acolyte-22B


Acolyte-22B


A LoRA trained on a bunch of random datasets on top of Mistral-Small-Instruct-2409, then SLERP-merged onto the base model at 0.5. Decent enough for its size. Check the LoRA repo for dataset info.
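The SLERP step blends each pair of weight tensors along the arc between them rather than along a straight line; at 0.5 both checkpoints contribute equally. A minimal NumPy sketch of the idea (illustrative only; actual merges are done per-tensor with a tool such as mergekit, and the function name here is hypothetical):

```python
import numpy as np

def slerp(a: np.ndarray, b: np.ndarray, t: float, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two flattened weight vectors."""
    a_n = a / (np.linalg.norm(a) + eps)
    b_n = b / (np.linalg.norm(b) + eps)
    dot = np.clip(np.dot(a_n, b_n), -1.0, 1.0)
    theta = np.arccos(dot)           # angle between the two weight vectors
    if theta < eps:                  # nearly parallel: fall back to plain lerp
        return (1.0 - t) * a + t * b
    sin_theta = np.sin(theta)
    return (np.sin((1.0 - t) * theta) / sin_theta) * a \
         + (np.sin(t * theta) / sin_theta) * b

# t = 0.5 weights base and tuned checkpoints equally
base = np.array([1.0, 0.0])
tuned = np.array([0.0, 1.0])
merged = slerp(base, tuned, 0.5)
```

Unlike linear averaging, SLERP preserves the angular geometry between the two weight sets, which is often gentler on merged model quality.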

Use the Mistral V2 & V3 instruct template.
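For reference, the Mistral instruct templates wrap each user turn in `[INST] ... [/INST]` tags. A rough sketch of that shape (assumption: exact whitespace and BOS/EOS handling is defined by the model's tokenizer config, so prefer `tokenizer.apply_chat_template` in practice; `mistral_prompt` is a hypothetical helper):

```python
def mistral_prompt(turns):
    """Wrap (user, assistant) turns in Mistral-style [INST] tags.

    Sketch only: real prompts should come from the tokenizer's own
    chat template, which handles special tokens authoritatively.
    """
    out = "<s>"
    for user, assistant in turns:
        out += f"[INST] {user} [/INST]"
        if assistant is not None:
            out += f" {assistant}</s>"
    return out

prompt = mistral_prompt([("Hello!", None)])
```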
