---
base_model: unsloth/llama-3.2-1b-bnb-4bit
language:
- en
- bg
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
datasets:
- petkopetkov/math_qa-bg
- petkopetkov/gsm8k-bg
- petkopetkov/winogrande_xl-bg
- petkopetkov/hellaswag-bg
- petkopetkov/mmlu-bg
- petkopetkov/arc-easy-bg
- petkopetkov/arc-challenge-bg
---

# Llama3.2-1B-Bulgarian-tokenizer

- **Developed by:** petkopetkov
- **License:** apache-2.0
- **Finetuned from model:** unsloth/llama-3.2-1b-bnb-4bit

Llama3.2-1B finetuned on datasets translated into Bulgarian, using a tokenizer trained on Bulgarian text:

- **MMLU**: multiple-choice questions from various branches of knowledge
- **Winogrande challenge**: testing world knowledge and commonsense understanding
- **HellaSwag**: testing sentence completion
- **ARC Easy/Challenge**: testing logical reasoning
- **GSM-8k**: grade-school math word problems requiring multi-step reasoning
- **MathQA**: math word problems

### Usage

First, install the Transformers library:

```sh
pip install -U transformers
```

#### Run with the `pipeline` API

```python
import torch
from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model="petkopetkov/Llama3.2-1B-bg-tokenizer",
    torch_dtype=torch.bfloat16,
    device_map="auto"
)

prompt = "Колко е 2 + 2?"  # "How much is 2 + 2?"

print(pipe(prompt)[0]['generated_text'])
```
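
#### Run with `AutoModelForCausalLM`

If you prefer loading the tokenizer and model directly instead of using `pipeline`, a minimal sketch is shown below. The repository id is reused from the pipeline example above, and the generation settings (e.g. `max_new_tokens`) are illustrative choices, not recommended values.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repository id taken from the pipeline example above
model_id = "petkopetkov/Llama3.2-1B-bg-tokenizer"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

prompt = "Колко е 2 + 2?"  # "How much is 2 + 2?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# max_new_tokens is an illustrative value; adjust to your use case
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```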