Gemma-2-2B-Bulgarian

  • Developed by: petkopetkov
  • License: apache-2.0
  • Finetuned from model: unsloth/gemma-2-2b-bnb-4bit

Gemma-2-2B finetuned on datasets translated into Bulgarian:

  • MMLU: multiple-choice questions from many branches of knowledge
  • Winogrande challenge: commonsense pronoun resolution, testing world knowledge and understanding
  • Hellaswag: commonsense sentence completion
  • ARC Easy/Challenge: grade-school science questions testing reasoning
  • GSM-8K: grade-school math word problems requiring multi-step reasoning
  • MathQA: math word problems
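Since most of these datasets are multiple-choice, prompts typically present a question followed by lettered answer options. The card does not document the exact prompt template used during finetuning, so the helper below and its layout (Cyrillic option letters, a trailing "Отговор:"/"Answer:" cue) are assumptions for illustration only:

```python
# Sketch: formatting an MMLU-style multiple-choice question in Bulgarian.
# The exact training prompt format is not documented in the model card;
# this helper and its layout are illustrative assumptions.

def format_multiple_choice(question: str, options: list[str]) -> str:
    letters = "АБВГ"  # Cyrillic option letters; Latin A-D would work too
    lines = [question]
    lines += [f"{letters[i]}) {opt}" for i, opt in enumerate(options)]
    lines.append("Отговор:")  # "Answer:"
    return "\n".join(lines)

prompt = format_multiple_choice(
    "Колко е 2 + 2?",  # "What is 2 + 2?"
    ["3", "4", "5", "22"],
)
print(prompt)
```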

Usage

First, install the Transformers library with:

pip install -U transformers

Run with the pipeline API

import torch
from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model="petkopetkov/gemma-2-2b-bg",
    torch_dtype=torch.bfloat16, 
    device_map="auto"
)

prompt = "Колко е 2 + 2?"

print(pipe(prompt)[0]['generated_text'])
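Note that the text-generation pipeline's `generated_text` includes the prompt followed by the model's continuation. To keep only the continuation, you can pass `return_full_text=False` to the pipeline call, or strip the prompt prefix yourself. A minimal sketch of the stripping approach, using a hard-coded string in place of real model output:

```python
# Sketch: separating the model's continuation from the echoed prompt.
# A hard-coded string stands in for pipe(prompt)[0]['generated_text'],
# which by default contains the prompt plus the generated continuation.

prompt = "Колко е 2 + 2?"
generated_text = "Колко е 2 + 2? 2 + 2 е 4."  # stand-in for real output

# Keep only the continuation; alternatively pass return_full_text=False
# to the pipeline call and skip this step entirely.
answer = generated_text[len(prompt):].strip()
print(answer)
```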
Model details

  • Model size: 2.61B params (Safetensors)
  • Tensor type: BF16

Model tree for petkopetkov/gemma-2-2b-bg

  • Base model: google/gemma-2-2b
  • This model: petkopetkov/gemma-2-2b-bg, finetuned from the base model