---
base_model:
- mistralai/Mistral-Nemo-Instruct-2407
- NeverSleep/Lumimaid-v0.2-12B
- Undi95/LocalC-12B-e2.0
- intervitens/mini-magnum-12b-v1.1
library_name: transformers
tags:
- mergekit
- merge
---
|
I have no idea what I’m doing… if this causes the apocalypse someone please let me know.
|
|
|
Lumimaid-Magnum-12B 8.0bpw h8 EXL2
|
|
|
Includes a [measurement.json](https://huggingface.co/FuturisticVibes/Lumimaid-Magnum-12B-8.0bpw-h8-exl2/tree/measurement) file for further quantization.
|
|
|
Original Model: https://huggingface.co/Undi95/Lumimaid-Magnum-12B
|
|
|
# Original Model Card
|
|
|
Merge of Lumimaid and Magnum, as requested by some.
|
|
|
I used the new DELLA merge method in mergekit and added to the mix a finetune of Nemo trained only on Claude input with 16k context.
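
For anyone curious what a DELLA merge looks like in mergekit, here is a rough sketch of how a merge of these models could be expressed. This is only an illustrative reconstruction based on the models listed in the metadata above; the weights, densities, epsilon, and lambda values are placeholder assumptions, not the settings actually used for this model.

```
models:
  - model: NeverSleep/Lumimaid-v0.2-12B
    parameters:
      weight: 0.33   # placeholder blend weight
      density: 0.6   # placeholder fraction of delta weights kept
  - model: intervitens/mini-magnum-12b-v1.1
    parameters:
      weight: 0.33
      density: 0.6
  - model: Undi95/LocalC-12B-e2.0   # presumably the Claude-input Nemo finetune mentioned above
    parameters:
      weight: 0.33
      density: 0.6
merge_method: della
base_model: mistralai/Mistral-Nemo-Instruct-2407
parameters:
  epsilon: 0.05   # placeholder window for magnitude-based drop probabilities
  lambda: 1.0     # placeholder rescaling of the merged deltas
dtype: bfloat16
```

A config like this would typically be run with `mergekit-yaml config.yaml ./merged-model`.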
|
|
|
# Prompt template: Mistral
|
|
|
```
<s>[INST] {input} [/INST] {output}</s>
```