---
language:
  - en
license: apache-2.0
library_name: transformers
base_model:
  - mistralai/Mistral-Nemo-Base-2407
  - Qwen/Qwen3-235B-A22B
---

*Homunculus logo*

## Overview

Arcee Homunculus is a 12B parameter model developed by Arcee.ai, based on the Mistral Nemo architecture.

It was produced by distilling Qwen3 235B logits onto Mistral Nemo after tokenizer replacement.
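For context, logit distillation of this kind trains the student to match the teacher's next-token distribution. The sketch below is a generic illustration of that loss, not Arcee's actual training code; it assumes the student and teacher already share a vocabulary, i.e. the tokenizer-replacement step has already been handled.

```python
# Minimal sketch of logit distillation (illustrative only, not Arcee's pipeline).
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=1.0):
    """KL divergence between teacher and student next-token distributions."""
    s = F.log_softmax(student_logits / temperature, dim=-1)
    t = F.softmax(teacher_logits / temperature, dim=-1)
    # batchmean reduction matches the standard knowledge-distillation formulation
    return F.kl_div(s, t, reduction="batchmean") * (temperature ** 2)

# Toy example with random logits over a shared vocabulary of 32 tokens
student = torch.randn(4, 32, requires_grad=True)
teacher = torch.randn(4, 32)
loss = distillation_loss(student, teacher, temperature=2.0)
loss.backward()
```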

Like Qwen3, it features both thinking and non-thinking modes. As suggested by the name, it is a weird little guy produced through alchemy.

Homunculus is a surprisingly powerful model for its size, delivering strong performance on real-world tasks while remaining small enough to run on consumer GPUs.

## Basic use


```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="arcee-ai/Homunculus")

messages = [
    {"role": "user", "content": "Who are you?"},
]
print(pipe(messages))
```
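
## Thinking and non-thinking modes

To switch between thinking and non-thinking modes, you can build the prompt with the chat template directly. The snippet below assumes Homunculus's chat template exposes the same `enable_thinking` flag as Qwen3; check the template shipped with the repository if this kwarg is not recognized.

```python
# Lower-level usage with an explicit chat template (sketch; the
# `enable_thinking` kwarg follows Qwen3's convention and is an assumption here).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "arcee-ai/Homunculus"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [{"role": "user", "content": "Who are you?"}]
prompt = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
    enable_thinking=False,  # set True to let the model emit its reasoning first
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```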