Update README.md
README.md CHANGED
@@ -11,14 +11,14 @@ tags:
 
 ## 13B-Chimera
 
-
+## Composition:
 [] = applied as LoRA to a composite model | () = combined as composite models
 
 ((MantiCore3E+VicunaCocktail)+[SuperCOT+[StorytellingV2+(SuperHOTProtoType-8192ctx+Metharme)]])
 
 This model is the result of an experimental use of LoRAs on language models and model merges that are not the base HuggingFace-format LLaMA model they were intended for.
 
-
+## Experiment Purpose:
 
 Observe the outcomes of applying Low-Rank Adapters (LoRAs) in unconventional ways.
 Determine whether applying and stacking LoRAs onto merged models bypasses the zero-sum result of weight-sum model merging.
 
@@ -31,7 +31,7 @@ Releasing Chimera as-is; Alpaca instruct verified working, Vicuna instruct forma
 
 If using KoboldAI or Text-Generation-WebUI, we recommend switching between the Godlike and Storywriter presets and adjusting output length plus the instructions kept in memory.
 Other presets, custom settings, and Temperature adjustments can also yield highly different results. If poking it with a stick doesn't work, try another stick.
 
-Language Models and LoRAs Used Credits:
+## Language Models and LoRAs Used Credits:
 
 manticore-13b [Epoch3] by openaccess-ai-collective
 
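The bracket notation in the Composition section mixes two operations: weight-sum merging of full models, ( ), and LoRA application on top of a composite, [ ]. As a toy illustration of why 50/50 weight averaging is zero-sum while a LoRA adds a delta on top, here is a minimal sketch — the dicts and numbers are made-up stand-ins for real tensor state dicts, not the actual checkpoints:

```python
# Toy illustration: weight-sum merging vs. applying a LoRA delta.
# Real checkpoints hold tensors; here each "model" is a dict of floats.

def weight_sum_merge(a, b, alpha=0.5):
    """Blend two models: every unit of weight given to one model is
    taken from the other -- the zero-sum trade-off of weight merging."""
    return {k: alpha * a[k] + (1 - alpha) * b[k] for k in a}

def apply_lora(model, lora_delta, scale=1.0):
    """A LoRA adds a (low-rank) delta on top of existing weights
    instead of trading them off against another model's weights."""
    return {k: v + scale * lora_delta.get(k, 0.0) for k, v in model.items()}

# Hypothetical one-parameter "models" for demonstration only.
manticore = {"w": 1.0}
vicuna = {"w": 3.0}

composite = weight_sum_merge(manticore, vicuna)  # 50/50 average: {"w": 2.0}
stacked = apply_lora(composite, {"w": 0.5})      # delta on top:  {"w": 2.5}
print(composite, stacked)
```

Stacking further LoRAs is just repeated `apply_lora` calls, which is why the adapter deltas accumulate rather than averaging each other away.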
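The preset advice above mostly comes down to sampling parameters, and Temperature in particular rescales the logits before the softmax. A minimal sketch of that effect, with illustrative toy logits rather than the actual KoboldAI/WebUI preset values:

```python
import math

def softmax_with_temperature(logits, temperature=1.0):
    """Lower temperature sharpens the distribution toward the top
    token; higher temperature flattens it toward uniform."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]              # toy next-token logits
cold = softmax_with_temperature(logits, 0.5)
hot = softmax_with_temperature(logits, 1.5)
# The top token dominates more at low temperature than at high.
print(cold[0], hot[0])
```

Presets bundle this with other samplers (top-p, top-k, repetition penalty), which is why switching presets can change the model's character so drastically.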