---
widget:
- text: አዲስ አበባ
example_title: Example 1
- text: በ ኢንግሊዝ ፕሪምየር ሊግ
example_title: Example 2
- text: ፕሬዚዳንት ዶናልድ ትራምፕ
example_title: Example 3
language:
- am
metrics:
- perplexity
library_name: transformers
pipeline_tag: text-generation
---
# gpt2-small-amharic-128-v3
This is a smaller version of the [gpt2](https://huggingface.co/openai-community/gpt2) decoder transformer model pretrained from scratch for **1.5 days** on **290 million tokens** of **Amharic** text.
- It has **33.7 Million parameters**
- The **context size** of this model is **128** tokens.
- It uses a GPT-2-style **tokenizer**, trained from scratch on the same Amharic dataset with a vocabulary size of **16384**.
- This is a base model and hasn't undergone any supervised fine-tuning yet.
It achieves the following results on the evaluation set:
- `Loss: 3.99`
- `Perplexity: 54.17`
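The reported perplexity is simply the exponential of the cross-entropy loss; the small gap in the sketch below comes from the loss being rounded to two decimals:

```python
import math

# Perplexity is exp(cross-entropy loss); exp(3.99) ≈ 54.05,
# which matches the reported 54.17 up to rounding of the loss.
loss = 3.99
perplexity = math.exp(loss)
print(round(perplexity, 2))
```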
### Demo
You can use the following demo to generate text using gpt2-small-amharic-128-v3. Please **enter a prompt** and click the **Generate** button to generate completions for the prompt.
https://huggingface.co/spaces/rasyosef/GPT2-Amharic
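You can also load the model directly with the `transformers` text-generation pipeline. A minimal sketch, assuming the repo id `rasyosef/gpt2-small-amharic-128-v3` (inferred from the title and the demo space owner; adjust if it differs):

```python
from transformers import pipeline

# Repo id is assumed from the model card title and the demo space owner;
# replace it if the actual Hub id differs.
generator = pipeline(
    "text-generation",
    model="rasyosef/gpt2-small-amharic-128-v3",
)

# Generate a completion for an Amharic prompt.
# Keep prompt + new tokens within the 128-token context size.
output = generator("አዲስ አበባ", max_new_tokens=32, repetition_penalty=1.2)
print(output[0]["generated_text"])
```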