Update README.md
README.md CHANGED
```diff
@@ -5,11 +5,10 @@ model-index:
   - name: out
     results: []
 ---
+### This is the instruction fine-tuned version of [Tiny Llama](https://github.com/jzhang38/TinyLlama) on [@Teknium1's](https://twitter.com/Teknium1) [openhermes](https://huggingface.co/datasets/teknium/openhermes) dataset.
 
-<!-- This model card has been generated automatically according to the information the Trainer had access to. You
-should probably proofread and complete it, then remove this comment. -->
+`"The TinyLlama project aims to pretrain a 1.1B Llama model on 3 trillion tokens. With some proper optimization, we can achieve this within a span of "just" 90 days using 16 A100-40G GPUs 🚀🚀. The training has started on 2023-09-01."`
 
-[<img src="https://raw.githubusercontent.com/OpenAccess-AI-Collective/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/OpenAccess-AI-Collective/axolotl)
 <details><summary>See axolotl config</summary>
 
 axolotl version: `0.3.0`
```
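For scale, the pretraining budget quoted above implies a per-GPU throughput of roughly 24k tokens per second. A quick back-of-the-envelope check in plain Python, using only the numbers from the quote (this note is editorial, not part of the commit):

```python
# Implied pretraining throughput from the quoted TinyLlama budget:
# 3 trillion tokens / (16 A100-40G GPUs x 90 days).
total_tokens = 3e12
gpus = 16
seconds = 90 * 24 * 3600  # 90 days in seconds

tokens_per_gpu_per_second = total_tokens / (gpus * seconds)
print(f"~{tokens_per_gpu_per_second:,.0f} tokens/s per GPU")  # ~24,113
```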
```diff
@@ -87,25 +86,9 @@ special_tokens:
 
 </details><br>
 
-# out
 
-This model was trained from scratch on the None dataset.
-It achieves the following results on the evaluation set:
-- Loss: 7.4061
-
-## Model description
-
-More information needed
-
-## Intended uses & limitations
-
-More information needed
-
-## Training and evaluation data
-
-More information needed
-
-## Training procedure
+The loss for the 3T checkpoint explodes for some reason
+![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/644bf6ef778ecbfb977e8e84/06bfkeS7cPoHxkeIHe5M7.jpeg)
 
 ### Training hyperparameters
 
```
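The updated card does not include a usage snippet; below is a minimal inference sketch using Hugging Face Transformers. The repo id is a placeholder (the actual model id is not stated in this commit), and the Alpaca-style prompt is an assumption: the authoritative template is whatever the axolotl config in the card specifies.

```python
# Minimal inference sketch. The repo id and prompt format are assumptions, not from the card.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "user/tinyllama-1.1b-openhermes"  # placeholder repo id, substitute the real one

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Alpaca-style instruction prompt (assumed; check the axolotl config for the real template).
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nWhat is instruction fine-tuning?\n\n### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```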