Llama3.2-1B-Bulgarian
- Developed by: petkopetkov
- License: apache-2.0
- Finetuned from model: unsloth/llama-3.2-1b-bnb-4bit
Llama3.2-1B finetuned on datasets translated into Bulgarian:
- MMLU: multiple-choice questions from various branches of knowledge
- Winogrande challenge: testing world knowledge and understanding
- Hellaswag: testing sentence completion
- ARC Easy/Challenge: testing logical reasoning
- GSM-8K: grade-school math word problems
- MathQA: math word problems
Usage
First, install the Transformers library with:
pip install -U transformers
Run with the pipeline API:
import torch
from transformers import pipeline
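# Load the text-generation pipeline with the Bulgarian fine-tune,
# using bfloat16 weights placed automatically on the available device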
pipe = pipeline(
"text-generation",
model="petkopetkov/Llama3.2-1B-bg",
torch_dtype=torch.bfloat16,
device_map="auto"
)
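# Bulgarian prompt, meaning "What is 2 + 2?"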
prompt = "Колко е 2 + 2?"
print(pipe(prompt)[0]['generated_text'])
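Alternatively, the model can be loaded directly with the standard Transformers AutoTokenizer and AutoModelForCausalLM classes. The sketch below is a minimal example under that assumption; the generation settings (e.g. max_new_tokens=64) are illustrative values, not recommendations from the model authors.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "petkopetkov/Llama3.2-1B-bg"

# Load the tokenizer and the model weights in bfloat16 on the available device
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

prompt = "Колко е 2 + 2?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Generate a short completion; max_new_tokens is an illustrative value
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))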