FLOR-6.3B-xat

FLOR-6.3B-xat is the result of fine-tuning the FLOR-6.3B model from Projecte Aina on the OpenAssistant v2 instructions, machine-translated into Catalan with Helsinki-NLP resources and converted to the ChatML format.
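
As a rough illustration of the target training format (not the author's actual preprocessing code), a translated user/assistant exchange would be serialized into ChatML turns roughly like this; the helper below is hypothetical:

```python
# Hypothetical helper: serialize (role, text) turns into a ChatML string.
# Sketch of the target format only; not the actual preprocessing pipeline.
def to_chatml(turns):
    chunks = []
    for role, text in turns:
        chunks.append(f"<|im_start|>{role}\n{text}<|im_end|>\n")
    return "".join(chunks)

print(to_chatml([
    ("user", "Qui va ser Isaac Newton?"),
    ("assistant", "Isaac Newton va ser un físic, matemàtic i astrònom anglès."),
]))
```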

Prompt Template

FLOR-6.3B-xat uses ChatML as its prompt template:

<|im_start|>user
Qui va ser Isaac Newton?<|im_end|>
<|im_start|>assistant\n
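
A minimal inference sketch (not taken from the original card), assuming the standard Hugging Face transformers API; the loading options, generation parameters, and decoding of the reply are illustrative assumptions:

```python
# Minimal sketch: load the model and generate a reply to a ChatML-formatted prompt.
# Assumes transformers, torch and accelerate are installed; settings are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "xaviviro/FLOR-6.3B-xat"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# Build the ChatML prompt exactly as shown above.
prompt = "<|im_start|>user\nQui va ser Isaac Newton?<|im_end|>\n<|im_start|>assistant\n"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200, do_sample=True, temperature=0.7)

# Decode only the newly generated tokens, dropping the prompt.
reply = tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(reply)
```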

Open LLM Leaderboard Evaluation Results

Detailed results are available on the Open LLM Leaderboard.

| Metric                             | Value |
|------------------------------------|-------|
| Avg.                               | 38.23 |
| AI2 Reasoning Challenge (25-shot)  | 38.65 |
| HellaSwag (10-shot)                | 63.76 |
| MMLU (5-shot)                      | 26.54 |
| TruthfulQA (0-shot)                | 37.96 |
| Winogrande (5-shot)                | 62.43 |
| GSM8k (5-shot)                     | 0.00  |
Model size: 6.25B params (safetensors, FP16)