---
language: id
widget:
- text: "Wahai rembulan yang tertutup awan hujan"
---
# Indonesian GPT-2 fine-tuned on Indonesian poems
This is the Indonesian gpt2-small model fine-tuned on Indonesian poems. The dataset can be found here. All training was done in a Google Colab Jupyter Notebook (soon).

The dataset is split into two subsets, with the following details:
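A fine-tuned GPT-2 model like this one can be loaded through the `transformers` text-generation pipeline. This is a minimal sketch: the model id below is a placeholder, not the actual repository name of this model.

```python
from transformers import pipeline

# Placeholder model id -- substitute this model's actual repository name.
generator = pipeline(
    "text-generation",
    model="username/gpt2-small-indonesian-poem",
)

# Generate a continuation of an Indonesian poem prompt,
# e.g. the widget example from the metadata above.
out = generator(
    "Wahai rembulan yang tertutup awan hujan",
    max_length=50,
    num_return_sequences=1,
)
print(out[0]["generated_text"])
```

The prompt here reuses the widget example from the metadata; any Indonesian text works as a seed.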
| split | count (examples) | percentage |
|---|---|---|
| train | 7,358 | 80% |
| validation | 1,890 | 20% |
## Evaluation results
The model evaluation results after 10 epochs are as follows:
| dataset | train/loss | eval/loss | eval perplexity |
|---|---|---|---|
| id puisi | 3.324700 | 3.502665 | 33.20 |
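As a sanity check, the reported perplexity is simply the exponential of the evaluation loss, which can be verified in a couple of lines:

```python
import math

# Perplexity for a language model is exp(cross-entropy loss).
eval_loss = 3.502665
perplexity = math.exp(eval_loss)
print(round(perplexity, 2))  # ~33.20, matching the table above
```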
The training logs can be found on the wandb page here.