MarsupialAI committed
Commit
b042989
1 Parent(s): 3107e0f

Update README.md

Files changed (1)
  1. README.md +3 -1
README.md CHANGED
@@ -10,7 +10,8 @@ tags:
 ---
 # Kitchen Sink 103b
 
-![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/65a531bc7ec6af0f95c707b1/KsbIQfRNIMQhlwbbV6gLX.jpeg)
+
+![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/65a531bc7ec6af0f95c707b1/QFmPxADHAqMf3Wb_Xt1ry.jpeg)
 
 This model is a rotating-stack merge of three 70b models, each of which is a merge of multiple merges and finetunes. The result is a large model that contains a little bit of everything - including the kitchen sink.
 
@@ -22,6 +23,7 @@ Component models for this stack are
 Components of those models include Nous-Hermes-Llama2-70b, Xwin-LM-7B-V0.1, Mythospice-70b, Euryale-1.3-L2-70B, tulu-2-dpo-70b, GOAT-70B-Storytelling, Platypus2-70B-instruct, Lila-70B, SunsetBoulevard, and some private LoRAs.
 
 
+
 # WTF is a rotating-stack merge?
 
 Jeb Carter found that the performance of a stacked merge could be significantly improved by reversing the model order in the stack, and then doing a linear merge between the original and reversed stacks. That is what I did here, creating three stacked merges with the three source models, and then doing a 1:1:1 linear merge of all three stacks. The exact merge configs can be found in the recipe.txt file.
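
To make the procedure concrete, here is a minimal sketch of what such a recipe can look like as mergekit configs. Everything in it is illustrative: the model names (`model-a`, `model-b`, `model-c`), layer ranges, stack paths, and dtype are placeholders, not the contents of recipe.txt.

```yaml
# Illustrative sketch only - placeholder models, layer ranges, and paths;
# see recipe.txt for the actual configs used for Kitchen Sink 103b.

# Config 1 of 4: one stacked (passthrough) merge. Overlapping layer
# slices from the three source models are concatenated into a taller stack.
slices:
  - sources:
      - model: model-a
        layer_range: [0, 40]
  - sources:
      - model: model-b
        layer_range: [20, 60]
  - sources:
      - model: model-c
        layer_range: [40, 80]
merge_method: passthrough
dtype: float16
---
# Configs 2 and 3 (not shown) are the same stack with the model order
# rotated, e.g. b/c/a and c/a/b, producing ./stack-bca and ./stack-cab.

# Config 4: a 1:1:1 linear merge averages the three finished stacks.
models:
  - model: ./stack-abc
    parameters:
      weight: 1.0
  - model: ./stack-bca
    parameters:
      weight: 1.0
  - model: ./stack-cab
    parameters:
      weight: 1.0
merge_method: linear
dtype: float16
```

Each YAML document above would normally sit in its own file and be run through `mergekit-yaml` in sequence; the linear step requires all three stacks to share the same layer count, which rotating (rather than arbitrarily reshuffling) the slice order preserves.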