ve-forbryderne committed
Commit df2b2d1 · Parent: 4fdb544

Model was trained on v3 pod, not v4 pod
README.md
CHANGED
@@ -8,7 +8,7 @@ license: apache-2.0
 This is the second generation of the original Shinen made by Mr. Seeker. The full dataset consists of 6 different sources, all surrounding the "Adult" theme. The name "Erebus" comes from Greek mythology and means "darkness". This is in line with Shin'en, or "deep abyss". For inquiries, please contact the KoboldAI community. **Warning: THIS model is NOT suitable for use by minors. The model will output X-rated content.**
 
 ## Training procedure
-GPT-NeoX-20B-Erebus was trained on a
+GPT-NeoX-20B-Erebus was trained on a TPUv3-256 TPU pod using a heavily modified version of Ben Wang's Mesh Transformer JAX library, the original version of which was used by EleutherAI to train their GPT-J-6B model. The training hyperparameters and statistics can be found [here](https://wandb.ai/ve-forbryderne/skein-20b?workspace=user-ve-forbryderne).
 
 ## Training data
 The data can be divided in 6 different datasets: