---
base_model:
- rAIfle/Acolyte-22B
quantized_by: Brioch
base_model_relation: quantized
pipeline_tag: text-generation
---
6.5 bpw EXL2 quant of [Acolyte-22B](https://huggingface.co/rAIfle/Acolyte-22B)
---
# Acolyte-22B

A LoRA trained on a mix of datasets on top of Mistral-Small-Instruct-2409, then SLERP-merged onto the base model at a weight of 0.5. Decent enough for its size.
Check the [LoRA](https://huggingface.co/rAIfle/Acolyte-LORA) for dataset info.
Use the `Mistral V2 & V3` prompt template.
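
A minimal sketch of loading this EXL2 quant with the exllamav2 Python API (the local model path, sampler settings, and prompt are placeholder assumptions, not part of this card; requires a GPU and a local download of this repo):

```python
# Sketch: loading an EXL2 quant with exllamav2 and generating with a
# Mistral-style instruct prompt. Paths and settings are assumptions.
from exllamav2 import ExLlamaV2, ExLlamaV2Cache, ExLlamaV2Config, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "/path/to/Acolyte-22B-exl2-6.5bpw"  # placeholder local path
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)  # split layers across available GPU memory
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8  # example value, tune to taste

# Mistral V2/V3 instruct formatting wraps the user turn in [INST] ... [/INST]
prompt = "[INST] Write a haiku about autumn. [/INST]"
print(generator.generate_simple(prompt, settings, num_tokens=128))
```

Loading and generation parameters beyond the template choice are not specified by the card, so treat everything except the `[INST] ... [/INST]` formatting as illustrative.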