BokantLM Logo

BokantLM 0.1–0.5B

BokantLM – "Small but Supreme in Its Domain"

BokantLM is not a general-purpose model that tries to do everything well.
Instead, it is an ultra-lightweight LLM designed to focus on a single domain, delivering the highest possible efficiency and performance in that area.


Overview


Philosophy

While most LLMs aim for versatility by learning across many fields,
BokantLM is built to achieve top efficiency and performance within a specific domain.

This 0.1–0.5B release is specialized in coding and algorithm problem solving,
with a particular focus on LeetCode-style challenges.
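As a sketch of how the model might be queried on a LeetCode-style problem (assuming it loads through the standard Hugging Face `transformers` causal-LM API, which its Qwen2.5-0.5B base supports; the prompt wording here is an assumption, not a documented template):

```python
# Hypothetical usage sketch: the prompt format below is an assumption,
# not a documented template for this model.
def build_prompt(problem: str) -> str:
    """Wrap a LeetCode-style problem statement in a simple instruction prompt."""
    return (
        "Solve the following problem in Python.\n\n"
        f"Problem:\n{problem}\n\n"
        "Solution:\n"
    )

if __name__ == "__main__":
    # Heavy imports kept here so the helper above stays dependency-free.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "llaa33219/BokantLM0.1-0.5B"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    prompt = build_prompt(
        "Given an array of integers nums and an integer target, "
        "return indices of the two numbers that add up to target."
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=256)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```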


Why I created this model

I created this model based on the idea that if a model focuses intensively on learning only Python, even a small model could become very good at Python programming.
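To make the target domain concrete, here is the kind of Python the model is meant to produce: a standard one-pass hash-map solution to LeetCode's classic Two Sum problem (an illustrative example, not actual model output).

```python
def two_sum(nums: list[int], target: int) -> list[int]:
    """Return indices of the two numbers in nums that add up to target.

    One-pass hash map: remember each value's index, and for each element
    check whether its complement has already been seen.  O(n) time.
    """
    seen: dict[int, int] = {}
    for i, x in enumerate(nums):
        if target - x in seen:
            return [seen[target - x], i]
        seen[x] = i
    return []

# e.g. two_sum([2, 7, 11, 15], 9) -> [0, 1]
```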


Future Plans

  • ✅ Coding (Python)-specialized model release (current version)
  • 🔄 Mathematics problem-solving specialized version
  • 🔄 Domain-specific ultra-lightweight models for law, medicine, science, etc.
  • 🔄 Attempt at applying large LLM knowledge distillation

Model size: 494M params (Safetensors, F32)

Model tree for llaa33219/BokantLM0.1-0.5B

Base model: Qwen/Qwen2.5-0.5B
