---
base_model: v000000/MysticGem-v1.3-Guanaco-L2-13B
library_name: transformers
tags:
- mergekit
- merge
- llama-cpp
- llama
---

Experiment: Manticore-Guanaco still holds up, so trying the same approach with Llama 2.

```yaml
models:
  - model: v000000/MysticGem-v1.3-L2-13B+Mikael110/llama-2-13b-guanaco-qlora
    parameters:
      weight: 1.0
merge_method: linear
dtype: bfloat16
```

### Prompt Format (Metharme):
```bash
<|system|>Take the role of {{char}} in a play where you leave a lasting impression on {{user}}. Never skip or gloss over {{char}}'s actions.
<|user|>{{user}}: {prompt}<|model|>{{char}}: {output}
```

### Prompt Format (Alpaca):
```bash
Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
Take the role of {{char}} in a play where you leave a lasting impression on {{user}}. Never skip or gloss over {{char}}'s actions.

### Instruction:
{prompt}

### Response:
{output}
```

+vicuna/guanaco
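For reference, a merge from a config like the YAML above can be run with mergekit's `mergekit-yaml` command-line entry point. This is only a sketch: the config filename, output directory, and the `--cuda` flag are example choices, not part of this repo.

```bash
pip install mergekit
# save the YAML config above as merge-config.yaml (name is arbitrary), then:
mergekit-yaml merge-config.yaml ./MysticGem-v1.3-Guanaco-L2-13B --cuda
```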
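Since the card is tagged llama-cpp, here is a rough sketch of prompting a GGUF quant of this merge with llama.cpp's CLI using the Metharme template above. The GGUF filename, sampling settings, and the literal `{{char}}`/`{{user}}` placeholders are assumptions for illustration; substitute your own quant and names.

```bash
# GGUF filename is a placeholder; replace {{char}}/{{user}} with actual character and user names
./llama-cli -m MysticGem-v1.3-Guanaco-L2-13B.Q5_K_M.gguf \
  -c 4096 --temp 0.8 -n 256 \
  -p "<|system|>Take the role of {{char}} in a play where you leave a lasting impression on {{user}}. Never skip or gloss over {{char}}'s actions.<|user|>{{user}}: Hi there.<|model|>{{char}}:"
```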