---
license: apache-2.0
tags:
- Solar
- Mistral
- Roleplay
---
|
|
|
![image/png](https://cdn-uploads.huggingface.co/production/uploads/64bb1109aaccfd28b023bcec/bjOB_8BsqVteKKxARPc13.png) |
|
|
|
### Summary |
|
|
|
I did a more aggressive gradient SLERP on the base model with a medical LoRA added to Frostwind, preserving more of the prose model; the prose seemed better without any degradation in coherency. I then redid the DARE TIES merge of it with its parent models, but with a slightly lower weight on Frostwind.
|
|
|
Prose is a bit stronger and coherency slightly weaker with this variant. It's excellent if you are prepared to regenerate now and then. I'm really happy with both of these models.
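For reference, SLERP interpolates each pair of weight tensors along the arc between them rather than along a straight line, which tends to preserve the magnitude of the merged weights. A minimal sketch of the idea (the function name and NumPy framing are mine, not mergekit's; mergekit's "gradient" variant additionally varies `t` across layers):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate values follow the arc
    between the two directions instead of the straight line.
    """
    a = v0.ravel().astype(np.float64)
    b = v1.ravel().astype(np.float64)
    # Cosine of the angle between the two tensors, treated as flat vectors.
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    theta = np.arccos(np.clip(cos, -1.0, 1.0))
    if theta < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation.
        return (1.0 - t) * v0 + t * v1
    s0 = np.sin((1.0 - t) * theta) / np.sin(theta)
    s1 = np.sin(t * theta) / np.sin(theta)
    return (s0 * a + s1 * b).reshape(v0.shape)
```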
|
|
|
### Recipe |
|
|
|
The formula for the DARE TIES merge is as follows (FrostMed is Frostwind with Solarmed merged at 0.15 weight):
|
|
|
```yaml
- model: ./Frostmaid
  parameters:
    density: [0.45] # density gradient
    weight: 0.23
- model: ./FrostMed
  parameters:
    density: [0.35] # density gradient
    weight: 0.18
- model: ./SnowLotus-10.7B-v2
  parameters:
    density: [1] # density gradient
    weight: 1
```
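To make the recipe above concrete: DARE randomly drops entries of each model's delta from the base and rescales the survivors so the expected delta is unchanged, and TIES resolves sign conflicts between models before combining. A rough, simplified sketch of the idea (NumPy, my own function names, not mergekit's exact arithmetic):

```python
import numpy as np

def dare_prune(base, finetuned, density, rng):
    """DARE: Drop-And-REscale the task vector (finetuned - base).

    Keeps each delta entry with probability `density` and rescales the
    survivors by 1/density, so the expected delta is unchanged.
    """
    delta = finetuned - base
    keep = rng.random(delta.shape) < density
    return np.where(keep, delta / density, 0.0)

def dare_ties(base, models, densities, weights, rng):
    """Combine DARE-pruned, weighted deltas with a TIES-style sign election."""
    deltas = np.stack([w * dare_prune(base, m, d, rng)
                       for m, d, w in zip(models, densities, weights)])
    # Elect a sign per parameter from the summed delta, then keep only
    # the contributions that agree with it.
    elected = np.sign(deltas.sum(axis=0))
    agree = (np.sign(deltas) == elected) & (deltas != 0)
    n = np.maximum(agree.sum(axis=0), 1)
    merged = np.where(agree, deltas, 0.0).sum(axis=0) / n
    return base + merged
```

In practice the merge itself is run through mergekit's CLI against a config like the one above.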
|
|
|
|
|
Resources used: |
|
|
|
https://huggingface.co/BlueNipples/SnowLotus-v2-10.7B |
|
|
|
https://huggingface.co/NyxKrage/FrostMaid-10.7B-TESTING-pt |
|
|
|
https://huggingface.co/Sao10K/Frostwind-10.7B-v1 |
|
|
|
https://huggingface.co/NyxKrage/Solar-Doc-10.7B-Lora |
|
|
|
https://github.com/cg123/mergekit/tree/main |