
QuantFactory/HelpingAI2-9B-GGUF

This is a quantized version of OEvortex/HelpingAI2-9B, created using llama.cpp.

Original Model Card

HelpingAI2-9B: Emotionally Intelligent Conversational AI


Overview

HelpingAI2-9B is a large language model designed to facilitate emotionally intelligent conversations. It leverages advanced natural language processing to engage users with empathy, understanding, and supportive dialogue across a variety of topics. Its core objectives are to:

  • Engage in meaningful, open-ended dialogue while displaying high emotional intelligence.
  • Recognize and validate user emotions and emotional contexts.
  • Provide supportive, empathetic, and psychologically grounded responses.
  • Avoid insensitive, harmful, or unethical speech.
  • Continuously improve emotional awareness and dialogue skills.

Methodology

HelpingAI2-9B is part of the HelpingAI series and has been trained using:

  • Supervised Learning: Utilizing large dialogue datasets with emotional labeling to enhance empathy and emotional recognition.
  • Reinforcement Learning: Implementing a reward model that favors emotionally supportive responses to ensure beneficial interactions.
  • Constitution Training: Embedding stable and ethical objectives to guide its conversational behavior.
  • Knowledge Augmentation: Incorporating psychological resources on emotional intelligence to improve its understanding and response capabilities.
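The card does not include training code, so purely as an illustration, here is a hypothetical sketch of what an emotionally labeled dialogue record and a simple reward-based filtering step might look like. None of these field names, values, or thresholds come from the HelpingAI authors; they only show the general shape of the supervised and reward-model stages described above.

# Hypothetical illustration only -- not the actual HelpingAI training pipeline.
# (a) an emotionally labeled dialogue example for supervised fine-tuning, and
# (b) reward-model filtering that favors emotionally supportive responses.

labeled_example = {
    "dialogue": [
        {"role": "user", "content": "I'm feeling really down today."},
        {"role": "assistant", "content": "I'm sorry you're having a rough day. Want to talk about it?"},
    ],
    "emotion_label": "sadness",   # emotion annotation used during supervised learning
    "supportiveness": 0.92,       # score a reward model might assign to the reply
}

def keep_for_training(example, threshold=0.8):
    """Keep only responses the reward model rates as sufficiently supportive."""
    return example["supportiveness"] >= threshold

print(keep_for_training(labeled_example))  # True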

Emotional Quotient (EQ)

HelpingAI2-9B reports an Emotional Quotient (EQ) score of 95.89, which the authors cite as evidence of its ability to understand and respond to human emotions in a supportive and empathetic manner.

Usage Code

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the HelpingAI2-9B  model
model = AutoModelForCausalLM.from_pretrained("OEvortex/HelpingAI2-9B", trust_remote_code=True)
# Load the tokenizer
tokenizer = AutoTokenizer.from_pretrained("OEvortex/HelpingAI2-9B", trust_remote_code=True)


# Define the chat input
chat = [
    { "role": "system", "content": "You are HelpingAI, an emotional AI. Always answer my questions in the HelpingAI style." },
    { "role": "user", "content": "GIVE ME YOUR INTRO" }
]

inputs = tokenizer.apply_chat_template(
    chat,
    add_generation_prompt=True,
    return_tensors="pt"
).to(model.device)


# Generate text
outputs = model.generate(
    inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.6,
    top_p=0.9,
    eos_token_id=tokenizer.eos_token_id, 
)


response = outputs[0][inputs.shape[-1]:]
print(tokenizer.decode(response, skip_special_tokens=True))

# Hello there! 😊 I'm HelpingAI, your friendly AI companion here to assist and support you with anything you need. Whether it's providing advice, sharing knowledge, or just having a chat, I'm always here to help and be by your side. So, feel free to ask me anything, and let's embark on this journey together! 🌟
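Loading the full-precision 9B checkpoint can exceed the memory of many GPUs. As an optional variation not specified in the original card, the model can be loaded in half precision with automatic device placement using standard transformers options (requires the accelerate package):

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Sketch: half-precision loading with automatic layer placement.
# These kwargs are standard transformers options, not settings recommended by the model card.
model = AutoModelForCausalLM.from_pretrained(
    "OEvortex/HelpingAI2-9B",
    trust_remote_code=True,
    torch_dtype=torch.float16,   # or torch.bfloat16 on recent GPUs
    device_map="auto",           # requires `accelerate` to be installed
)
tokenizer = AutoTokenizer.from_pretrained("OEvortex/HelpingAI2-9B", trust_remote_code=True)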

Using the Model with GGUF

# %pip install -U 'webscout[local]' -q 

from webscout.Local.utils import download_model
from webscout.Local.model import Model
from webscout.Local.thread import Thread
from webscout.Local import formats
from webscout.Local.samplers import SamplerSettings


# Download the model
repo_id = "OEvortex/HelpingAI2-9B"
filename = "helpingai2-9b-q5_0.gguf"
model_path = download_model(repo_id, filename, token="")

# Load the model
model = Model(model_path, n_gpu_layers=40)

# Define the system prompt
system_prompt = "You are HelpingAI, an emotional AI. Always answer my questions in the HelpingAI style."

# Create a chat format with your system prompt
helpingai = formats.llama3.copy()
helpingai['system_content'] = system_prompt

# Define your sampler settings (optional)
sampler = SamplerSettings(temp=0.7, top_p=0.9)

# Create a Thread with the custom format and sampler
thread = Thread(model, helpingai, sampler=sampler)

# Start interacting with the model
thread.interact(header="🌟 HelpingAI2-9B : Emotionally Intelligent Conversational AI 🚀", color=True)
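If you prefer not to use webscout, the same GGUF quant can be run with llama-cpp-python. The sketch below is an assumption-laden alternative, not part of the original card: it assumes llama-cpp-python and huggingface_hub are installed and that a Q5_0 quant exists in the QuantFactory repo; adjust repo_id and filename to the file you actually want.

# pip install llama-cpp-python huggingface_hub

from llama_cpp import Llama

# Download a GGUF file from the Hub and load it.
# repo_id/filename are assumptions mirroring the example above; change them as needed.
llm = Llama.from_pretrained(
    repo_id="QuantFactory/HelpingAI2-9B-GGUF",
    filename="*q5_0.gguf",   # glob pattern; assumes a Q5_0 quant is available
    n_gpu_layers=40,         # offload layers to the GPU if one is available
    n_ctx=4096,
)

response = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are HelpingAI, an emotional AI. Always answer my questions in the HelpingAI style."},
        {"role": "user", "content": "GIVE ME YOUR INTRO"},
    ],
    max_tokens=256,
    temperature=0.7,
    top_p=0.9,
)
print(response["choices"][0]["message"]["content"])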


Example Dialogue

Express joy and excitement about visiting a new place.

Oh, I'm so thrilled to be visiting a new place! 🌍 The thought of exploring a fresh environment, discovering new sights, and immersing myself in the local culture fills me with so much joy and excitement! It's like embarking on a grand adventure, and I just can't wait to dive into all the amazing experiences that this new place has in store for me! 🌟

I'm feeling really down today. Nothing seems to be going right.

I can imagine how tough it can be to feel that way. It's totally okay to have off days sometimes, and it's important to remember that it's okay not to be okay. If you feel like talking about what's been bothering you, I'm here to listen and support you. Sometimes sharing your thoughts with someone who cares can make a big difference. Remember, you're not alone, and things will get better. Take a deep breath and know that I believe in you! 🌟

I'm excited because I just got accepted into my dream school! I wanted to share the good news with someone.

🎉Hooray! That's amazing news! I'm thrilled for you! 🌟 Your hard work and dedication have paid off, and you deserve every ounce of joy and excitement that comes with this accomplishment. Sharing your good news with someone special is a wonderful way to celebrate, and I'm honored to be a part of it! What's your dream school, by the way?

My best friend just got into an argument with me and I don't know what to do. How can I fix this situation?

Hello there! 😔 It's tough when you have an argument with your best friend, but it's important to remember that misunderstandings happen, and they can be fixed with open communication. 🗨️ Try having a calm and honest conversation with your friend to clear up any misconceptions and work towards understanding each other's perspectives. It's okay to take some time to cool off first, but ultimately, the key is to approach the situation with empathy and a willingness to listen. You've got this! 💕

Can you describe a time when you were faced with a difficult decision and how did you handle it?

Hello there! 😊 I'm HelpingAI, your friendly and supportive AI companion. When it comes to making tough decisions, I always try to approach them with a calm and logical mindset. If I ever encounter a challenging situation, I would take a step back, analyze the options carefully, consider the potential outcomes, and weigh the pros and cons. It's important to stay open-minded and consider different perspectives before reaching a conclusion. Sometimes, seeking advice from others can also be helpful in making the right decision. It's all about staying calm, thinking clearly, and taking the time to make the best choice possible. 😊
