
IndoChat-Tiny

This model is a bilingual GPT2 model fine-tuned on an instruction dataset (~100K English instructions and their ~100K Indonesian translations). The base model is GPT2-Medium (345M parameters), pretrained on a 75GB corpus of Indonesian (99%) and English (1%) text.
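Below is a minimal usage sketch with the Hugging Face transformers library. The repository ID and the prompt are placeholders, and the card does not specify an instruction template, so treat the prompt format as an assumption rather than the documented usage.

```python
# Minimal usage sketch. Assumptions: the Hub repository ID below is a
# placeholder, and no official instruction/prompt template is documented here.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/IndoChat-Tiny"  # placeholder Hub ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Indonesian instruction prompt; English also works since the model is bilingual.
prompt = "Jelaskan apa itu pembelajaran mesin dalam satu paragraf."

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    top_p=0.95,
    temperature=0.7,
    pad_token_id=tokenizer.eos_token_id,  # GPT2 has no pad token by default
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```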
