---
license: agpl-3.0
---

This is my laziest model card to date: it's a merge of roughly 70% Llama-2-chat, 15% Holodeck, and 15% Chronos.

My goal with this was to induce some prose style and novel structuring into Llama-2-chat without losing its magic. I think it worked? idk, this probably won't be the last of this series.

Holodeck and Chronos were first merged 50/50 into 'HoloChronos', which was then layer-merged into Llama-2-chat using the following gradient, with 'HoloChronos' as the #2 model:

[0.2, 0.2, 0.2, 0.2, 0.3, 0.4, 0.3, 0.2, 0.1, 0.3]
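Roughly, a gradient merge like this blends each group of transformer layers with its own ratio. The sketch below is a hypothetical illustration of the idea, not the actual BlockMerge_Gradient code; the layer-to-group mapping, function names, and toy tensors are all assumptions for the example.

```python
import numpy as np

# The 10-entry gradient from above: the blend ratio given to the #2 model
# (HoloChronos) for each group of layers.
GRADIENT = [0.2, 0.2, 0.2, 0.2, 0.3, 0.4, 0.3, 0.2, 0.1, 0.3]
NUM_LAYERS = 40  # Llama-2-13B has 40 transformer layers

def layer_ratio(layer_idx: int) -> float:
    """Map a layer index onto the 10-entry gradient (4 layers per entry here)."""
    group = layer_idx * len(GRADIENT) // NUM_LAYERS
    return GRADIENT[group]

def merge_layer(base: np.ndarray, other: np.ndarray, ratio: float) -> np.ndarray:
    """Linear interpolation: `ratio` is the weight given to the #2 model."""
    return (1.0 - ratio) * base + ratio * other

# Example with toy 2x2 "weights": the first layer group takes 20% from
# HoloChronos and 80% from Llama-2-chat.
w_chat = np.ones((2, 2))
w_holo = np.zeros((2, 2))
merged = merge_layer(w_chat, w_holo, layer_ratio(0))  # every entry is 0.8
```

In practice the real script iterates over every named parameter tensor in both checkpoints and writes the interpolated result to a new model.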

I recommend using the Llama-2-chat prompt format, but model merges are unholy, so YMMV.
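For reference, the standard Llama-2-chat template looks like this (the system block is optional; `{system_prompt}` and `{user_message}` are placeholders):

```
<s>[INST] <<SYS>>
{system_prompt}
<</SYS>>

{user_message} [/INST]
```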

Thank you to Gryphe for his merge script: https://github.com/Gryphe/BlockMerge_Gradient/tree/main

Thank you to Mr. Seeker and Elinas for the Holodeck and ChronosV2 models, respectively:

https://huggingface.co/KoboldAI/LLAMA2-13B-Holodeck-1

https://huggingface.co/elinas/chronos-13b-v2