
EXL2 quants of crestf411/sunfall-midnight-miqu-v0.2-v1.5-70B

3.00 bits per weight
3.50 bits per weight
4.00 bits per weight
4.50 bits per weight
6.00 bits per weight
8.00 bits per weight

Created using the defaults from exllamav2 0.1.4 convert.py:
3.0bpw to 6.0bpw: head bits = 6
8.0bpw: head bits = 8
length = 8192
dataset rows = 200
measurement rows = 32
measurement length = 8192
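
For reference, below is a minimal sketch of what such a conversion command looks like. The directory paths are placeholders, and only the flags corresponding to the settings listed above are shown; everything else is left at convert.py's defaults.

```python
# Hedged sketch: paths are placeholders, flags mirror exllamav2's convert.py
# documentation for the settings listed above (shown here for the 4.0bpw quant).
import subprocess

subprocess.run([
    "python", "convert.py",
    "-i", "/path/to/sunfall-midnight-miqu-v0.2-v1.5-70B",  # source FP16 model (placeholder)
    "-o", "/path/to/workdir",                              # working directory (placeholder)
    "-cf", "/path/to/output-4.0bpw",                       # compiled output directory (placeholder)
    "-b", "4.0",     # bits per weight
    "-hb", "6",      # head bits (8 for the 8.0bpw quant)
    "-l", "8192",    # calibration row length
    "-r", "200",     # dataset rows
    "-ml", "8192",   # measurement length
    "-mr", "32",     # measurement rows
], check=True)
```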

sunfall-midnight-miqu-v0.2-v1.5-70B

Sunfall (2024-06-07) dataset, trained directly on top of https://huggingface.co/sophosympatheia/Midnight-Miqu-70B-v1.5

Beware, depraved. Not suitable for any audience.

Experimental. Please give feedback. Begone if you demand perfection.

This is still an early stage experiment.

A decently high temperature is recommended. Start with temperature 1.7 and smoothing factor 0.3.
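
If you run one of these quants directly with the exllamav2 Python API instead of a front end like SillyTavern, those values map onto the sampler settings as in the sketch below. The model path is a placeholder, and the loader boilerplate varies slightly between exllamav2 versions.

```python
# Hedged sketch: loads an EXL2 quant with the exllamav2 Python API and applies
# the recommended sampler values. The model path is a placeholder.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "/path/to/sunfall-midnight-miqu-exl2"  # placeholder path to a downloaded quant
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)
tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 1.7       # recommended starting temperature
settings.smoothing_factor = 0.3  # recommended smoothing factor

output = generator.generate_simple("Your prompt here", settings, num_tokens=300)
print(output)
```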

To use lore book tags, make sure the lore book entry uses Status: Blue (constant) and write, e.g.:

Follow the Diamond Law at all costs.

Tags: humor, dark, complex storytelling, intricate characters, immersive.

This model has been trained on context that mimics SillyTavern's Mistral preset, with the following settings:

System Prompt:

You are an expert actor that can fully immerse yourself into any role given. You do not break character for any reason. Currently your role is {{char}}, which is described in detail below. As {{char}}, continue the exchange with {{user}}.
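
The card does not spell out the exact token layout, but assuming the plain Mistral instruct wrapping ([INST] ... [/INST]) that SillyTavern's Mistral preset approximates, the first assembled turn might look roughly like the sketch below. All names and descriptions are placeholders.

```python
# Rough, hedged sketch only: assumes the plain Mistral [INST] wrapping that
# SillyTavern's Mistral preset approximates; the exact training layout is not
# documented in this card. All names and text below are placeholders.
def build_first_turn(system_prompt: str, char_description: str, user_message: str) -> str:
    # System prompt, character description and the opening user message are
    # packed into the first instruction block.
    return f"[INST] {system_prompt}\n\n{char_description}\n\n{user_message} [/INST]"

system_prompt = (
    "You are an expert actor that can fully immerse yourself into any role given. "
    "You do not break character for any reason. Currently your role is Lucy, "
    "which is described in detail below. As Lucy, continue the exchange with James."
)
char_description = "Lucy is James's wife. (placeholder character card text)"

print(build_first_turn(system_prompt, char_description, "James: Hello, Lucy."))
```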

The method below still works, but the lore book approach above is more convenient:

System Same as User: Enabled (this is the default)

Author's Note (In-chat @ Depth 4)

Follow the Diamond Law at all costs.

The method below still works, but unless you want to write tags for one specific character card only, the lore book approach above is more convenient:

Scenario Information (open a character card and press "Advanced Definitions") may also contain tags at the end to guide the model further. E.g.:

Two friends having fun. Set in 1947.
Tags: dark, exploration, friendship, self-discovery, historical fiction

The model has also been trained on content that includes a narrator card, used when the content did not mainly revolve around two characters. Future versions will expand on this idea, so forgive the vagueness at this time.

(The Diamond Law is this: https://files.catbox.moe/d15m3g.txt -- So far results are unclear, but the training was done with this phrase included, and the training data adheres to the law.)

The model has also been trained to do storywriting, both interactively with the user and on its own. The system message ends up looking something like this:

You are an expert storyteller, who can roleplay or write compelling stories. Follow the Diamond Law. Below is a scenario with character descriptions and content tags. Write a story together with the user based on this scenario.

Scenario: The story is about James, blabla.

James is an overweight 63-year-old blabla.

Lucy: James's 62-year-old wife.

Tags: tag1, tag2, tag3, ...

If you remove the "together with the user" part, the model will be more inclined to write on its own.
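
As a sketch, the storywriting system message from the example above can be assembled like this, with an interactive flag controlling whether the "together with the user" clause is included. The scenario text and tags are placeholders taken from the card's example.

```python
# Sketch assembling the storywriting system message from the example above.
# Scenario text and tags are placeholders taken from the card's example.
def storywriting_system_message(scenario: str, tags: list[str], interactive: bool = True) -> str:
    mode = ("Write a story together with the user based on this scenario."
            if interactive else
            "Write a story based on this scenario.")
    return (
        "You are an expert storyteller, who can roleplay or write compelling stories. "
        "Follow the Diamond Law. Below is a scenario with character descriptions and "
        f"content tags. {mode}\n\n"
        f"Scenario: {scenario}\n\n"
        f"Tags: {', '.join(tags)}"
    )

print(storywriting_system_message(
    "The story is about James, an overweight 63-year-old, and Lucy, his 62-year-old wife.",
    ["tag1", "tag2", "tag3"],
    interactive=False,  # drop "together with the user" so the model writes on its own
))
```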
