---
datasets:
- Squish42/bluemoon-fandom-1-1-rp-cleaned
- OpenLeecher/Teatime
- PygmalionAI/PIPPA
tags:
- not-for-all-audiences
- nsfw
license: cc-by-nc-4.0
---

## What is PetrolLM-Claude-Chat?

PetrolLM-Claude-Chat is the [CollectiveCognition-v1.1-Mistral-7B](https://huggingface.co/teknium/CollectiveCognition-v1.1-Mistral-7B) model with the [PetrolLoRA](https://huggingface.co/Norquinal/PetrolLoRA) applied.

The dataset (for the LoRA) consists of 2800 samples, with the following composition:
* AICG Logs (~34%)
* PygmalionAI/PIPPA (~33%)
* Squish42/bluemoon-fandom-1-1-rp-cleaned (~29%)
* OpenLeecher/Teatime (~4%)

These samples were then back-filled using gpt-4/gpt-3.5-turbo-16k or otherwise converted to fit the prompt format.

## Prompt Format

The model uses the following prompt format:

```
---
style: roleplay
characters:
  [char]: [description]
summary: [scenario]
---
Format:
[char]: [message]
Human: [message]
```

## Use in Text Generation Web UI

Install the bleeding-edge version of `transformers` from source:

```
pip install git+https://github.com/huggingface/transformers
```

Alternatively, change `model_type` in `config.json` from `mistral` to `llama`.

## Use in SillyTavern UI

![](https://files.catbox.moe/2dkr28.png)

As an addendum, you can include one of the following as the `Last Output Sequence`:

```
Human: In your next reply, write at least two paragraphs. Be descriptive and immersive, providing vivid details about {{char}}'s actions, emotions, and the environment.

{{char}}:
```

```
{{char}} (2 paragraphs, engaging, natural, authentic, descriptive, creative):
```

```
[System note: Write at least two paragraphs. Be descriptive and immersive, providing vivid details about {{char}}'s actions, emotions, and the environment.]

{{char}}:
```

The third one seems to work best. I recommend experimenting with your own to best suit your needs.
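
Outside of the UIs above, you can also load the model directly with `transformers`. Below is a minimal sketch of doing so and prompting it in the format described earlier; the repository id, character details, and generation settings are assumptions for illustration, not part of this card.

```python
# Minimal sketch: load the model with transformers and prompt it in the
# documented format. Repository id, character, and sampling settings below
# are assumptions chosen for illustration.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Norquinal/PetrolLM-Claude-Chat"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

# Build a prompt following the documented format (character and scenario are hypothetical).
prompt = (
    "---\n"
    "style: roleplay\n"
    "characters:\n"
    "  Seraphina: a kind forest guardian\n"
    "summary: Seraphina finds a lost traveler in the woods\n"
    "---\n"
    "Human: Hello? Is anyone there?\n"
    "Seraphina:"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.8)

# Print only the newly generated continuation.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```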