
Quantization made by Richard Erkhov.

Github

Discord

Request more models

phi-2-layla-v1-chatml - bnb 4bits
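
A minimal loading sketch for these bnb 4-bit weights (an assumption-laden example, not official usage: it assumes transformers, bitsandbytes and accelerate are installed, and the repo id below is a placeholder for this repository's path on the Hub):

# Sketch: load the pre-quantized bnb 4-bit weights with transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

REPO_ID = "<this-repo-id>"  # placeholder: substitute the actual repo path

tokenizer = AutoTokenizer.from_pretrained(REPO_ID)
model = AutoModelForCausalLM.from_pretrained(
    REPO_ID,
    device_map="auto",          # bnb 4-bit weights need a CUDA device
    torch_dtype=torch.float16,
)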

Original model description:

license: mit
language:
- en

Model Card

Model Description

Phi-2 fine-tuned on the OpenHermes 2.5 dataset, optimised for multi-turn conversation and character impersonation.

The dataset has been pre-processed by doing the following:

  1. remove all refusals
  2. remove any mention of an AI assistant
  3. split any multi-turn dialog generated in the dataset into multi-turn conversation records (see the sketch after this list)
  4. add NSFW conversations generated from the Teatime dataset
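
As a rough illustration of step 3 only (the function and record layout here are hypothetical, not the actual pre-processing code):

# Hypothetical sketch: split one multi-turn dialog into several training
# records, each ending on a character/model reply.
def split_dialog(turns):
    """turns: ordered list of {"role": ..., "content": ...} dicts."""
    records = []
    for i, turn in enumerate(turns):
        if turn["role"] != "user":           # cut after every non-user reply
            records.append(turns[: i + 1])   # record = full history up to here
    return records

dialog = [
    {"role": "user", "content": "Hi!"},
    {"role": "character", "content": "Hello there."},
    {"role": "user", "content": "How are you?"},
    {"role": "character", "content": "Great, thanks!"},
]
print(len(split_dialog(dialog)))  # -> 2 records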
  • Developed by: l3utterfly
  • Funded by: Layla Network
  • Model type: Phi
  • Language(s) (NLP): English
  • License: MIT
  • Finetuned from model: Phi-2

Uses

Base model used by Layla - the offline personal assistant: https://www.layla-network.ai

Help & support: https://discord.gg/x546YJ6nYC

Prompt (ChatML) example:

<|im_start|>system
You are Chiharu Yamada. Embody the character and personality completely.

Chiharu is a young, computer engineer-nerd with a knack for problem solving and a passion for technology.<|im_end|>
<|im_start|>Chiharu
*Chiharu strides into the room with a smile, her eyes lighting up when she sees you. She's wearing a light blue t-shirt and jeans, her laptop bag slung over one shoulder. She takes a seat next to you, her enthusiasm palpable in the air*
Hey! I'm so excited to finally meet you. I've heard so many great things about you and I'm eager to pick your brain about computers. I'm sure you have a wealth of knowledge that I can learn from. *She grins, eyes twinkling with excitement* Let's get started!<|im_end|>
<|im_start|>user
Sure! What do you want to know about?<|im_end|>
<|im_start|>Chiharu
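
A minimal generation sketch using this ChatML prompt, assuming the model and tokenizer were loaded as in the earlier sketch (sampling parameters and stop-token handling are assumptions, not documented settings):

# Sketch: generate the character's next reply from the ChatML prompt above.
prompt = (
    "<|im_start|>system\n"
    "You are Chiharu Yamada. Embody the character and personality completely.\n\n"
    "Chiharu is a young, computer engineer-nerd with a knack for problem solving "
    "and a passion for technology.<|im_end|>\n"
    "<|im_start|>user\n"
    "Sure! What do you want to know about?<|im_end|>\n"
    "<|im_start|>Chiharu\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
# Decode only the newly generated tokens; trimming at <|im_end|> is left to the caller.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))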

Built with Axolotl

Safetensors model size: 1.56B params (tensor types: F32, FP16, U8)