IndoChat-Tiny

This model is a bilingual GPT2 model fine-tuned on an instruction dataset (~100K English instructions and their ~100K Indonesian translations). The base model is GPT2-Medium (345M parameters), pretrained on a 75GB corpus of Indonesian (99%) and English (1%) text.
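
Below is a minimal usage sketch with the `transformers` library for generating a response to an instruction. The repository id `indochat-tiny`, the prompt, and the generation settings are placeholders, not values taken from this card; substitute the actual Hub id and a prompt format matching the instruction data.

```python
# Minimal sketch, assuming the model is hosted on the Hugging Face Hub
# under a causal-LM head. "indochat-tiny" is a placeholder repository id.
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "indochat-tiny"  # placeholder; replace with the real repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Example Indonesian instruction ("How do I make fried rice?")
prompt = "Bagaimana cara membuat nasi goreng?"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a completion; generation parameters are illustrative only.
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```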
