### Model Card: TinyStoriesChinese-110M

**Overview:**

![alt text](README.files/79e6f31072d75ef82135302dd88859a.png)

TinyStoriesChinese-110M is a charmingly small, toy-like language model that generates short, straightforward stories in Chinese. This Small Language Model (SLM) is designed to process and create text with the simplicity and innocence of children’s tales. Despite its small size, TinyStoriesChinese-110M produces coherent stories with solid, endearingly childlike grammar, showing an emerging understanding of basic concepts such as personal hygiene and illness.
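
For readers who want to try the model, here is a minimal generation sketch using the Hugging Face `transformers` library. It assumes the checkpoint follows the standard causal-LM layout on the Hub; the repo id and the sampling settings below are hypothetical placeholders, not values confirmed by this card, so substitute the model's actual path.

```python
# Minimal story-generation sketch (assumes a standard causal-LM checkpoint).
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-namespace/TinyStoriesChinese-110M"  # hypothetical placeholder

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# Seed the story with a short Chinese prompt:
# "从前，有一只小兔子" -- "Once upon a time, there was a little rabbit".
inputs = tokenizer("从前，有一只小兔子", return_tensors="pt")

# Sampling settings are illustrative, not tuned values from the authors.
outputs = model.generate(
    **inputs,
    max_new_tokens=120,  # keep the story short and child-sized
    do_sample=True,
    top_p=0.9,
    temperature=0.8,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```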
Inspired by the TinyStories research, which explores how effectively small language models can learn from simplified training material, TinyStoriesChinese-110M focuses on a deliberately narrow task. The model is trained on a synthetic dataset of stories that even a three-year-old could understand. This approach highlights the potential of smaller models to produce coherent, consistent text without the need for extensive computational resources.