Phi-3.5B Finetuned Model

1. Introduction

This model is a finetuned version of Phi-3.5 Mini Instruct, a ~3.8B-parameter large language model. It is designed to provide detailed and accurate responses to university course-related queries. The model has been optimized to deliver insights into course details, fee structures, duration options, and campus locations, along with links to the relevant course pages. The finetuning process used a domain-specific dataset to ensure precision and reliability.


2. Dataset Used for Finetuning

The finetuning of the Phi-3.5B model was performed using a private dataset created through web scraping. The data was collected from the University of Westminster website and included:

  • Course titles
  • Campus details
  • Duration options (full-time, part-time, distance learning)
  • Fee structures (for UK and international students)
  • Course descriptions
  • Direct links to course pages

The dataset was cleaned and structured to enhance the model's ability to generate accurate and context-aware responses.
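For illustration only, a single cleaned record might resemble the sketch below. The field names and placeholder values are hypothetical, since the actual dataset is private; the course title is taken from the example query in section 3.

# A purely illustrative record; the real dataset is private and its exact
# schema and field names may differ.
example_record = {
    "course_title": "AI, Data and Communication MA",   # matches the example query in section 3
    "campus": "<campus name>",
    "duration_options": ["full-time", "part-time", "distance learning"],
    "fees": {"uk": "<UK fee>", "international": "<international fee>"},
    "description": "<short course description>",
    "course_url": "https://www.westminster.ac.uk/...",
}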


3. How to Use This Model

To use the Phi-3.5B finetuned model, follow the steps below:
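First, load the finetuned model and its tokenizer. The snippet below is a minimal loading sketch using the standard transformers API; the repo id is a placeholder and should be replaced with this model's Hugging Face repository id.

from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder -- replace with this model's Hugging Face repository id.
model_id = "<model-repo-id>"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# The base checkpoint is a bitsandbytes 4-bit model, so the bitsandbytes
# package must be installed when loading a 4-bit checkpoint.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")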

from transformers import TextStreamer

def chatml(question, model):
    # Build a single-turn chat and apply the model's chat template
    messages = [{"role": "user", "content": question}]

    inputs = tokenizer.apply_chat_template(messages,
                                           tokenize=True,
                                           add_generation_prompt=True,
                                           return_tensors="pt").to("cuda")

    # Stream the answer to stdout as it is generated
    text_streamer = TextStreamer(tokenizer, skip_special_tokens=True,
                                 skip_prompt=True)
    return model.generate(input_ids=inputs,
                          streamer=text_streamer,
                          max_new_tokens=512)

question = "Does the University of Westminster offer a course on AI, Data and Communication MA?"
output = chatml(question, model)

With this setup you can query the finetuned model and receive detailed, course-specific responses: the answer is streamed to the console as it is generated, and the generated token ids are returned.
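If you also need the answer as a plain string rather than only the streamed console output, the returned token ids can be decoded; the sketch below assumes the output variable from the call above.

# `output` contains the generated token ids (prompt + answer); decode them
# to obtain the response as a string.
answer = tokenizer.decode(output[0], skip_special_tokens=True)
print(answer)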


Uploaded model

  • Developed by: roger33303
  • License: apache-2.0
  • Finetuned from model: unsloth/phi-3.5-mini-instruct-bnb-4bit
  • Model size: 3.82B params (BF16, Safetensors)