---
base_model: unsloth/gemma-2-2b-bnb-4bit
language:
  - en
  - bg
license: apache-2.0
tags:
  - text-generation-inference
  - transformers
  - unsloth
  - gemma2
  - trl
datasets:
  - petkopetkov/math_qa-bg
  - petkopetkov/gsm8k-bg
  - petkopetkov/winogrande_xl-bg
  - petkopetkov/hellaswag-bg
  - petkopetkov/mmlu-bg
  - petkopetkov/arc-easy-bg
  - petkopetkov/arc-challenge-bg
---

# Gemma-2-2B-Bulgarian

- **Developed by:** petkopetkov
- **License:** apache-2.0
- **Finetuned from model:** unsloth/gemma-2-2b-bnb-4bit

Gemma-2-2B fine-tuned on datasets translated into Bulgarian:

- **MMLU**: multiple-choice questions from various branches of knowledge
- **Winogrande challenge**: testing world knowledge and commonsense understanding
- **HellaSwag**: testing commonsense sentence completion
- **ARC Easy/Challenge**: testing logical reasoning
- **GSM8K**: grade-school math word problems requiring multi-step reasoning
- **MathQA**: multiple-choice math word problems

## Usage

First, install the Transformers library (along with Accelerate, which `device_map="auto"` relies on):

```bash
pip install -U transformers accelerate
```

### Run with the pipeline API

```python
import torch
from transformers import pipeline

# Load the model with bfloat16 weights and let Accelerate place it
# on the available device(s).
pipe = pipeline(
    "text-generation",
    model="petkopetkov/gemma-2-2b-bg",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

prompt = "Колко е 2 + 2?"  # "What is 2 + 2?"

# The returned text includes the prompt followed by the generated continuation.
print(pipe(prompt)[0]["generated_text"])
```
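Base Gemma-2 chat checkpoints wrap conversation turns in `<start_of_turn>`/`<end_of_turn>` markers; whether this fine-tune was trained on that template is not stated here, so treat the helper below as a sketch under that assumption. When the tokenizer ships a chat template, `tokenizer.apply_chat_template` is the safer route.

```python
# Sketch: build a Gemma-2-style single-turn chat prompt by hand.
# The turn markers follow the upstream Gemma-2 chat format; it is an
# assumption that this fine-tune expects them.
def build_gemma_prompt(user_message: str) -> str:
    return (
        "<start_of_turn>user\n"
        f"{user_message}<end_of_turn>\n"
        "<start_of_turn>model\n"
    )

prompt = build_gemma_prompt("Колко е 2 + 2?")
print(prompt)
```

The resulting string can be passed to the pipeline in place of the raw prompt; the model is then expected to generate the assistant turn after `<start_of_turn>model`.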