---
license: apache-2.0
datasets:
- ajibawa-2023/Children-Stories-Collection
language:
- en
tags:
- story
- young children
- educational
- knowledge
---

**Young-Children-Storyteller-Mistral-7B**

This model is based on my dataset [Children-Stories-Collection](https://huggingface.co/datasets/ajibawa-2023/Children-Stories-Collection), which contains over 0.9 million stories written for young children (ages 6 to 12).
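If you want to inspect the underlying data, here is a minimal sketch using the Hugging Face `datasets` library; the `train` split name is an assumption, not something this card specifies:

```python
# Minimal sketch, assuming the Hugging Face `datasets` library
# and that the collection exposes a default "train" split.
from datasets import load_dataset

stories = load_dataset("ajibawa-2023/Children-Stories-Collection", split="train")
print(stories)     # row count and column names
print(stories[0])  # one example story record
```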

Drawing upon synthetic datasets meticulously designed with the developmental needs of young children in mind, Young-Children-Storyteller is more than just a tool—it's a companion on the journey of discovery and learning.
With its boundless storytelling capabilities, this model serves as a gateway to a universe brimming with wonder, adventure, and endless possibilities.

Whether it's embarking on a whimsical adventure with colorful characters, unraveling mysteries in far-off lands, or simply sharing moments of joy and laughter, Young-Children-Storyteller fosters a love for language and storytelling from the earliest of ages.
Through interactive engagement and age-appropriate content, it nurtures creativity, empathy, and critical thinking skills, laying a foundation for lifelong learning and exploration.

Rooted in a vast repository of over 0.9 million specially curated stories tailored for young minds, Young-Children-Storyteller is poised to revolutionize the way children engage with language and storytelling.

Kindly note this is a qLoRA version.

**GGUF & Exllama**

GGUF: TBA

Exllama: TBA

**Training**

The entire dataset was trained on 4 x A100 80GB GPUs. Training for 3 epochs took more than 30 hours. The Axolotl codebase was used for training. The model was fine-tuned from Mistral-7B-v0.1.
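For orientation, here is a minimal qLoRA sketch with `transformers`, `peft`, and `bitsandbytes`. The actual run used Axolotl, so the adapter rank, alpha, and target modules below are illustrative assumptions, not the real training configuration:

```python
# Illustrative qLoRA sketch only -- the actual training used Axolotl.
# Hyperparameters (rank, alpha, target modules) are assumptions.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

base = "mistralai/Mistral-7B-v0.1"

# Load the frozen base model in 4-bit (the "q" in qLoRA).
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
model = AutoModelForCausalLM.from_pretrained(base, quantization_config=bnb_config)
model = prepare_model_for_kbit_training(model)

# Attach trainable low-rank adapters; only these are updated during training.
lora = LoraConfig(
    r=16, lora_alpha=32, lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora)
model.print_trainable_parameters()
```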

**Example Prompt:**

This model uses the **ChatML** prompt format.

```
<|im_start|>system
You are a Helpful Assistant who can write educational stories for Young Children.<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant

```
You can modify the above prompt as per your requirements.
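
A minimal inference sketch follows; the repo id and the generation settings are assumptions, not part of this card:

```python
# Minimal sketch; repo id and generation settings are assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "ajibawa-2023/Young-Children-Storyteller-Mistral-7B"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(repo, device_map="auto")

# Build the ChatML prompt exactly as shown above.
prompt = (
    "<|im_start|>system\n"
    "You are a Helpful Assistant who can write educational stories "
    "for Young Children.<|im_end|>\n"
    "<|im_start|>user\n"
    "Tell a short story about a curious turtle who learns to share.<|im_end|>\n"
    "<|im_start|>assistant\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=512, do_sample=True, temperature=0.7)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```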

I want to say special thanks to the open-source community for helping and guiding me to better understand AI/model development.

Thank you for your love & support.

**Example Output**

Example 1

![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/64aea8ff67511bd3d965697b/H2FucX0CTtV25wlgHmifN.jpeg)

Example 2

![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/64aea8ff67511bd3d965697b/o7hiMI5noO8fPedUG75H8.jpeg)

Example 3

![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/64aea8ff67511bd3d965697b/J48WYa1qmKnRaILA_44Ao.jpeg)