MiniLingua-1b-Instruct

MiniLingua-1b-Instruct is an instruction-tuned multilingual model based on the MiniLingua-1b base model. It supports a diverse set of European languages as well as programming code, making it suitable for instruction following, multilingual generation, and downstream tasks such as question answering and summarisation.

Supported Languages

  • Bulgarian
  • Czech
  • Dutch
  • English
  • Finnish
  • French
  • German
  • Greek
  • Italian
  • Polish
  • Portuguese
  • Spanish
  • Swedish
  • Programming code

Instruction Tuning

This preview instruction-tuned version of MiniLingua-1b was trained for one epoch on 1.2 million instructions drawn from high-quality instruction datasets.

The supervised fine-tuning (SFT) was performed on the Triton Aalto cluster using 4 H200 GPUs.
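The card does not specify the exact data schema used for SFT, but instruction-tuning corpora typically pair an instruction (optionally with input context) with a target response. A minimal illustrative record, with field names assumed rather than taken from this model's actual training data:

```python
# Illustrative SFT record layout; the field names ("instruction",
# "input", "output") are a common convention, not confirmed by the card.
example = {
    "instruction": "Translate the sentence into German.",
    "input": "The weather is nice today.",
    "output": "Das Wetter ist heute schön.",
}

def to_training_text(rec):
    """Flatten one record into a single supervised training string."""
    context = f"\n{rec['input']}" if rec.get("input") else ""
    return f"### Instruction:\n{rec['instruction']}{context}\n### Response:\n{rec['output']}"

print(to_training_text(example))
```

During SFT, the loss is usually computed only on the response portion of such a flattened string, so the model learns to produce answers rather than to echo instructions.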

Intended Use

This model is a preview release intended for:

  • Multilingual instruction following
  • Evaluation and benchmarking
  • Research in low- and high-resource European languages

Limitations

  • This version is a first-stage SFT release; no alignment steps have been applied.
  • Some languages may show uneven instruction-following ability depending on resource availability and instruction diversity.

License: Apache-2.0

Model details

  • Format: Safetensors
  • Model size: 1B params
  • Tensor type: BF16