CL-13B-Fabula

CL-13B-Fabula is a fine-tuned version of Meta's CodeLlama 13B Instruct model, optimized for roleplay and general knowledge tasks while maintaining its chat-understanding capabilities.
This model is essentially a bigger version of L3.1-8B-Fabula: I wanted to make something a bit bigger, but my VPS storage can't handle a 70B model, so a 13B model it is.

Model Details

Chat Template

  • ChatML was used during fine-tuning.
/**
 * Formats messages into the ChatML prompt format.
 * @param {Array<{role: string, name?: string, content: string}>} messages
 * @returns {{prompt: string, stop: string}}
 */
function chatml2(messages) {
    // If the chat ends with an assistant message, that turn is left "open"
    // (no <|im_end|>) so the model continues it instead of starting a new turn.
    const isLastMessageAssistant = messages[messages.length - 1]?.role === "assistant";

    return {
        prompt: messages.map((message, index) => {
            const nameStr = message.name ? ` [${message.name}]` : "";
            const isLast = index === messages.length - 1;
            const needsEndTag = !isLastMessageAssistant || !isLast;

            return `<|im_start|>${message.role.toLowerCase()}${nameStr}\n${message.content}${needsEndTag ? "<|im_end|>" : ""}`;
        }).join("\n") + (isLastMessageAssistant ? "" : "\n<|im_start|>assistant\n"),
        stop: "<|im_end|>"
    };
}
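
For example (the roles, names, and messages below are just placeholders), calling the helper above with a short chat history produces a prompt that ends with an open assistant turn:

const { prompt, stop } = chatml2([
    { role: "system", content: "You are {{char}}, a helpful roleplay partner." },
    { role: "user", name: "{{user}}", content: "Hello, who are you?" }
]);

console.log(prompt);
// <|im_start|>system
// You are {{char}}, a helpful roleplay partner.<|im_end|>
// <|im_start|>user [{{user}}]
// Hello, who are you?<|im_end|>
// <|im_start|>assistant
console.log(stop); // "<|im_end|>"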

I would highly recommend adding a set of rules as an assistant-role message at the end of the chat history, like the example below:

<rules for="{{char}}'s responses">
1. I will write {{char}}'s response in a short but detailed manner (I will try to keep it under 300 characters).

2. Response formatting:
   "This is for talking"
   *This is for actions, or for self-reflection if I decide to write {{char}}'s response in first person*
   ex: "Hello, there!" *{name} waves,* "How are you doing today?"

3. When I feel it is {{user}}'s turn to talk, I will not act as {{user}} or speak for them; I will simply stop generating text by emitting my EOS (end-of-sequence) token "<|im_end|>", letting the user write their response as {{user}}.

4. I will use my past messages as an example of how {{char}} speaks
</rules>
**{{char}}'s response:**
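
One way to wire this in, shown as a minimal sketch (chatHistory and rulesText are placeholder names, and the rule list is abbreviated), is to append the rules block as the final assistant-role message before building the prompt. Because chatml2 leaves a trailing assistant message open, the model then continues writing right after **{{char}}'s response:**.

// Placeholder chat history; in practice this is your full conversation so far.
const chatHistory = [
    { role: "user", name: "{{user}}", content: "Hey {{char}}, how was your day?" }
];

// Abbreviated: paste the full <rules> block from the example above here.
const rulesText = [
    '<rules for="{{char}}\'s responses">',
    "...",
    "</rules>",
    "**{{char}}'s response:**"
].join("\n");

// Appending the rules as the last assistant message means chatml2 leaves this
// turn open (no <|im_end|>), so generation continues right after the rules.
const { prompt, stop } = chatml2([
    ...chatHistory,
    { role: "assistant", content: rulesText }
]);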
