Mistral-7B Finetuned Model

1. Introduction

This model is a finetuned version of the Mistral-7B large language model, tailored to answer course-related queries. It is optimized for questions about course details, fees, duration, and other specifics, and can link users to relevant web pages for further information. The finetuning process used a custom dataset to ensure strong performance in this domain.


2. Dataset Used for Finetuning

A private dataset was employed to finetune the Mistral-7B model. The dataset was created through web scraping from the University of Westminster website. The scraped information includes:

  • Course titles
  • Campus details
  • Duration options (e.g., full-time, part-time, distance learning)
  • Fee structures (for UK and international students)
  • Descriptions of courses
  • Direct URLs to course pages

The dataset was preprocessed and structured to maximize the model's domain-specific utility and accuracy.
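
For illustration, one record in such a dataset might be structured as follows; the field names and values here are hypothetical, since the actual dataset is private:

```python
# Hypothetical shape of one scraped course record.
# Field names and values are illustrative, not the real dataset schema.
record = {
    "course_title": "Artificial Intelligence and Digital Communication MSc",
    "campus": "Regent Campus",
    "duration": {"full_time": "1 year", "part_time": "2 years"},
    "fees": {"uk": "£9,000", "international": "£16,000"},
    "description": "Covers AI techniques applied to digital communication.",
    "url": "https://www.westminster.ac.uk/courses/<course-page>",
}

# Each record can then be rendered into an instruction/response pair
# for finetuning.
print(sorted(record))
```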


3. How to Use This Model

To use the finetuned Mistral-7B model, run the following Python code:

alpaca_prompt = """Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request.
### Instruction:
{}
### Input:
{}
### Response:
{}"""

from transformers import TextStreamer

# Assumes `model` and `tokenizer` are already loaded and in scope
# (e.g. via unsloth's FastLanguageModel.from_pretrained).
def chatml(question, model):
    # Fill the instruction slot of the prompt; input and response stay empty.
    inputs = tokenizer([alpaca_prompt.format(question, "", "")],
                       return_tensors="pt").to("cuda")

    # Stream generated tokens to stdout as they are produced.
    text_streamer = TextStreamer(tokenizer, skip_special_tokens=True,
                                 skip_prompt=True)

    return model.generate(**inputs, streamer=text_streamer,
                          max_new_tokens=512,
                          do_sample=True,
                          temperature=0.9,
                          top_p=0.5,
                          top_k=20,
                          repetition_penalty=1.1,
                          eos_token_id=tokenizer.eos_token_id,
                          use_cache=True,
                          )

# Example call
question = "Which course is related to AI and Communication at Westminster?"
response = chatml(question, model)

This script defines a chatml function that formats a query with an Alpaca-style prompt and streams the generated answer to stdout. It assumes that model and tokenizer have already been loaded (for example, with unsloth's FastLanguageModel.from_pretrained); replace question with your own query to get a response.
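
The generation settings above (do_sample with temperature, top_p, and top_k) control how the next token is sampled. As a rough, simplified sketch of how top-k and nucleus (top-p) filtering narrow the candidate pool, assuming toy probabilities rather than real model logits:

```python
def filter_top_k_top_p(probs, top_k, top_p):
    """Return indices surviving top-k, then top-p (nucleus) filtering.

    Simplified sketch: `probs` is a toy probability list, not real
    model output, and the cutoff rule approximates what generate() does.
    """
    # Sort token indices by descending probability.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    # Keep at most the top_k most probable candidates.
    order = order[:top_k]
    # Keep the smallest prefix whose cumulative probability reaches top_p.
    kept, cumulative = [], 0.0
    for i in order:
        kept.append(i)
        cumulative += probs[i]
        if cumulative >= top_p:
            break
    return kept

probs = [0.40, 0.25, 0.15, 0.10, 0.05, 0.05]
print(filter_top_k_top_p(probs, top_k=20, top_p=0.5))  # [0, 1]
```

With top_p=0.5, sampling is restricted to the most probable tokens that together cover about half the probability mass, which keeps answers focused on likely completions; a higher temperature would flatten the distribution before this filtering and admit less likely tokens.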


4. Model Details

  • Developed by: roger33303
  • License: apache-2.0
  • Finetuned from: unsloth/mistral-7b-v0.3-bnb-4bit
  • Format: Safetensors
  • Model size: 7.25B parameters
  • Tensor type: BF16
  • Repository: roger33303/mistral-7b-Instruct-Finetune-website-QnA