Built with Axolotl

An instruct-based fine-tune of mistralai/Mistral-7B-Instruct-v0.2.

It works well with long system prompts.

It isn't a general-purpose model: it is intended for reasoning and text comprehension, not for tasks such as storytelling.

This model was trained on a private dataset. The high GSM8K score is NOT due to the MetaMath dataset.

Prompt format (see the prompting guidelines of the base model):

<s>[INST] {system_message} . Say "Acknowledged!" if you understood. [/INST] Acknowledged! </s> [INST] {prompt} [/INST]
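
Below is a minimal sketch of how this prompt format could be assembled and run with the transformers library. The system message, user prompt, and generation settings are illustrative assumptions, not values recommended by the model author.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Mihaiii/Metis-0.3"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

# Illustrative system message and user prompt (assumptions, not from the model card).
system_message = "You answer questions strictly based on the provided text."
prompt = "Summarize the main argument of the passage above in two sentences."

# Build the prompt exactly as in the format above: the system message goes in the
# first [INST] block, the model "acknowledges" it, and the user prompt follows.
text = (
    f'<s>[INST] {system_message} . Say "Acknowledged!" if you understood. [/INST] '
    f"Acknowledged! </s> [INST] {prompt} [/INST]"
)

# <s> is already in the string, so skip adding special tokens again.
inputs = tokenizer(text, return_tensors="pt", add_special_tokens=False).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)

# Decode only the newly generated tokens.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```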