rasyosef committed on
Commit 98ad696
1 Parent(s): 3bb0848

Update README.md

Files changed (1)
  1. README.md +24 -10
README.md CHANGED
@@ -1,21 +1,35 @@
  ---
  widget:
- - text: አዲስ አበባ
- example_title: Example 1
- - text: በ ኢንግሊዝ ፕሪምየር ሊግ
- example_title: Example 2
- - text: ፕሬዚዳንት ዶናልድ ትራምፕ
- example_title: Example 3
  ---

  # gpt2-small-amharic-128-v3

- This is a smaller version of the [gpt2](https://huggingface.co/openai-community/gpt2) decoder transformer model pretrained from scratch for **1.5 days** on **290 million tokens** of **Amharic** text. The **context size** of this model is **128** tokens. It has the same tokenizer as gpt2, trained from scratch using the same dataset with a vocabulary size of **16384**.

- This is a base model and hasn't undergone any supervised finetuning yet.

  ### Demo

- You can use the following demo to generate text using this model. Please **enter a prompt** and click the **Generate** button to generate completions for the prompt.

- https://huggingface.co/spaces/rasyosef/GPT2-Amharic
  ---
  widget:
+ - text: አዲስ አበባ
+ example_title: Example 1
+ - text: በ ኢንግሊዝ ፕሪምየር ሊግ
+ example_title: Example 2
+ - text: ፕሬዚዳንት ዶናልድ ትራምፕ
+ example_title: Example 3
+ language:
+ - am
+ metrics:
+ - perplexity
+ library_name: transformers
+ pipeline_tag: text-generation
  ---

  # gpt2-small-amharic-128-v3

+ This is a smaller version of the [gpt2](https://huggingface.co/openai-community/gpt2) decoder transformer model pretrained from scratch for **1.5 days** on **290 million tokens** of **Amharic** text.

+ - It has **33.7 million parameters**.
+ - The **context size** of this model is **128** tokens.
+ - It has the same **tokenizer** as gpt2, trained from scratch using the same dataset with a vocabulary size of **16384**.
+ - This is a base model and hasn't undergone any supervised finetuning yet.
+
+ It achieves the following results on the evaluation set:
+
+ - `Loss: 3.96`
+ - `Perplexity: 52.55`

  ### Demo

+ You can use the following demo to generate text using gpt2-small-amharic. Please **enter a prompt** and click the **Generate** button to generate completions for the prompt.

+ https://huggingface.co/spaces/rasyosef/GPT2-Amharic
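
The two evaluation numbers in the updated card are linked by the usual definition: perplexity is the exponential of the mean per-token cross-entropy loss. A minimal check in Python, assuming the reported `Loss` is that mean cross-entropy:

```python
import math

# Perplexity = exp(mean cross-entropy loss), assuming the reported eval loss
# is the per-token cross-entropy on the evaluation set.
eval_loss = 3.96
print(round(math.exp(eval_loss), 2))  # 52.46; the reported 52.55 corresponds to the unrounded loss (~3.962)
```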
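
Since the new front matter declares `library_name: transformers` and `pipeline_tag: text-generation`, the model should load with the standard `transformers` text-generation pipeline. A minimal sketch, assuming the repository id is `rasyosef/gpt2-small-amharic-128-v3` (inferred from the model card title, not stated in the diff):

```python
from transformers import pipeline

# Assumed repository id, inferred from the model card title.
model_id = "rasyosef/gpt2-small-amharic-128-v3"

# Plain GPT-2 style decoder, so the generic text-generation pipeline applies.
generator = pipeline("text-generation", model=model_id)

# One of the card's widget prompts ("Addis Ababa"); keep prompt plus output
# within the model's 128-token context window.
outputs = generator("አዲስ አበባ", max_new_tokens=60, do_sample=True, temperature=0.8)
print(outputs[0]["generated_text"])
```

The sampling settings are illustrative; the demo Space linked above exposes the same kind of generation interactively.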