---
language:
- en
- da
tags:
- text-generation-inference
- transformers
- unsloth
- mistral
- trl
base_model: Mabeck/Heidrun-Mistral-7B-base
datasets:
- oscar
- Mabeck/danish-OpenHermes
- kobprof/skolegpt-instruct
---

# Model description

Heidrun-Mistral-7B-chat is a chat model based on Heidrun-Mistral-7B-base, finetuned on danish-OpenHermes and skoleGPT for an instruction/chat format.
# Datasets

This model is trained on Danish instruction datasets, which have not been safeguarded or aligned.
Most of the data has been machine-translated and may contain incorrect responses.
# Samples

This model uses the ChatML format. Using other formats will severely degrade the model's performance.
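A minimal generation sketch in Python with transformers is shown below. It assumes the chat model is published on the Hub as Mabeck/Heidrun-Mistral-7B-chat and builds the ChatML prompt by hand; the example question and generation settings are illustrative, not prescribed by this card.

```python
# Minimal sketch: load the chat model and prompt it in ChatML format.
# The model id, question, and generation settings below are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Mabeck/Heidrun-Mistral-7B-chat"  # assumed Hub id for this chat model
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# Build a ChatML-formatted prompt, which is the format this model expects.
prompt = (
    "<|im_start|>user\n"
    "Hvad er hovedstaden i Danmark?<|im_end|>\n"
    "<|im_start|>assistant\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```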
# Uploaded model

- Developed by: Mabeck
- Finetuned from model: Mabeck/Heidrun-Mistral-7B-base

This Mistral model was trained 2x faster with Unsloth and Hugging Face's TRL library.
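As a rough illustration of that setup, the sketch below pairs Unsloth's FastLanguageModel loader with TRL's SFTTrainer. The LoRA settings, hyperparameters, 4-bit loading, and the assumption that the dataset already carries a ChatML-formatted "text" column are placeholders rather than the actual training configuration, and it assumes a TRL version that still accepts dataset_text_field and max_seq_length directly.

```python
# Rough sketch of an Unsloth + TRL SFT setup; all hyperparameters and data
# handling below are illustrative placeholders, not this model's actual config.
from unsloth import FastLanguageModel
from datasets import load_dataset
from transformers import TrainingArguments
from trl import SFTTrainer

max_seq_length = 2048  # assumed value

# Load the base model through Unsloth's optimized loader.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="Mabeck/Heidrun-Mistral-7B-base",
    max_seq_length=max_seq_length,
    load_in_4bit=True,  # assumption: QLoRA-style 4-bit loading
)

# Attach LoRA adapters (rank and target modules are placeholders).
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    lora_alpha=16,
)

# Instruction data; assumes a "text" column already formatted as ChatML.
dataset = load_dataset("Mabeck/danish-OpenHermes", split="train")

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=max_seq_length,
    args=TrainingArguments(
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        num_train_epochs=1,
        learning_rate=2e-4,
        output_dir="outputs",
    ),
)
trainer.train()
```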