MarsupialAI committed
Commit: 99a9d59
1 Parent(s): 09d625e
Update README.md
README.md CHANGED
@@ -13,16 +13,23 @@ tags:
 
 ![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/65a531bc7ec6af0f95c707b1/QFmPxADHAqMf3Wb_Xt1ry.jpeg)
 
-This model is a rotating-stack merge of three 70b models
+This model is a rotating-stack merge of three 70b models in a 103b (120 layer) configuration inspired by Venus 103b. The result is a large model that contains a little bit of everything - including the kitchen sink.
 
-Component models for
+Component models for the rotating stack are
 - royallab/Aetheria-L2-70B
 - lizpreciatior/lzlv_70b_fp16_hf
 - Sao10K/WinterGoddess-1.4x-70B-L2
 
-Components of those models include Nous-Hermes-Llama2-70b, Xwin-LM-7B-V0.1, Mythospice-70b, Euryale-1.3-L2-70B, tulu-2-dpo-70b, GOAT-70B-Storytelling, Platypus2-70B-instruct, Lila-70B, SunsetBoulevard, and some private LoRAs.
+Components of those models are purported to include: Nous-Hermes-Llama2-70b, Xwin-LM-7B-V0.1, Mythospice-70b, Euryale-1.3-L2-70B, tulu-2-dpo-70b, GOAT-70B-Storytelling, Platypus2-70B-instruct, Lila-70B, SunsetBoulevard, and some private LoRAs.
 
 
+# Sample output
+
+Storywriting
+```
+Sample goes here
+```
+
 
 # WTF is a rotating-stack merge?
 
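For readers unfamiliar with the technique, here is a rough Python sketch of what the 103b (120 layer) rotating-stack layout described in the diff above could look like. Only the three donor model names come from the README; the slice boundaries and the `layer_map` helper are illustrative assumptions, not the author's actual merge recipe.

```python
# Hypothetical sketch of a rotating-stack ("frankenmerge") layer layout.
# Three 80-layer Llama2-70b donors contribute overlapping 20-layer slices
# that rotate through the stack twice, totalling 120 layers (103b-class).
# The slice boundaries are illustrative assumptions, not this model's recipe.

DONORS = [
    "royallab/Aetheria-L2-70B",
    "lizpreciatior/lzlv_70b_fp16_hf",
    "Sao10K/WinterGoddess-1.4x-70B-L2",
]

# (donor index, first layer, last layer + 1): the stack rotates A, B, C, A, B, C,
# each slice starting 12 layers deeper into its donor than the previous one.
SLICES = [
    (0, 0, 20),
    (1, 12, 32),
    (2, 24, 44),
    (0, 36, 56),
    (1, 48, 68),
    (2, 60, 80),
]

def layer_map(slices):
    """Flatten the slices into the ordered (donor, source layer) pairs
    that make up the merged model's stack."""
    return [(DONORS[d], layer) for d, start, end in slices for layer in range(start, end)]

stack = layer_map(SLICES)
assert len(stack) == 120  # 103b-class depth built from 70b donors
print(stack[0], stack[-1])
```

Tools like mergekit express the same idea declaratively as a passthrough merge over layer slices; the point of the sketch is simply that the 120-layer depth comes from interleaving overlapping slices of the three 80-layer donors, not from training any new weights.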