Gemma-2-9b Finetuned Model

1. Introduction

This model is a finetuned version of the Gemma-2-9b large language model, designed to provide precise and informative answers to questions related to university courses. It specializes in offering course details, including fees, duration, and campus information, while linking users to the appropriate pages for further exploration. The model has been finetuned with a private dataset to ensure high-quality, domain-specific responses.


2. Dataset Used for Finetuning

A private dataset was used to finetune this model. The dataset was meticulously created by web scraping course-related information from the University of Westminster website. The scraped data includes:

  • Course titles
  • Campus locations
  • Duration details (full-time, part-time, distance learning)
  • Fees (for UK and international students)
  • Course descriptions
  • Links to individual course pages

The data was cleaned, organized, and structured to enhance the model's performance in responding to domain-specific queries.
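The dataset itself is private, but the structuring step can be sketched. Assuming each scraped row carries the fields listed above, a converter into Alpaca-style instruction/input/output records might look like this (the field names, course details, and wording below are illustrative assumptions, not the actual dataset schema):

```python
# Illustrative sketch only: field names and example values are assumptions,
# not the actual private dataset schema.
def to_alpaca_record(course: dict) -> dict:
    """Turn one scraped course row into an Alpaca-style training example."""
    instruction = (
        f"Give me the details of the {course['title']} course "
        "at the University of Westminster."
    )
    output = (
        f"{course['title']} is taught at the {course['campus']} campus. "
        f"Duration: {course['duration']}. "
        f"Fees: {course['fees_uk']} (UK), {course['fees_intl']} (international). "
        f"More information: {course['url']}"
    )
    return {"instruction": instruction, "input": "", "output": output}

# Hypothetical example row
example = to_alpaca_record({
    "title": "Example AI MSc",
    "campus": "Harrow",
    "duration": "1 year full-time",
    "fees_uk": "£9,500",
    "fees_intl": "£16,000",
    "url": "https://www.westminster.ac.uk/",
})
```

Records in this shape slot directly into the Alpaca prompt template used at inference time below.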


3. How to Use This Model

You can interact with the finetuned Gemma-2-9b model using the following Python code:

alpaca_prompt = """Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request.
### Instruction:
{}
### Input:
{}
### Response:
{}"""

from transformers import TextStreamer

def chatml(question, model):
    # Assumes `tokenizer` and `model` are already loaded (see note below).
    inputs = tokenizer([alpaca_prompt.format(question, "", "")],
                       return_tensors="pt").to("cuda")

    # Stream generated tokens to stdout as they are produced,
    # skipping the prompt and any special tokens.
    text_streamer = TextStreamer(tokenizer, skip_special_tokens=True,
                                 skip_prompt=True)

    return model.generate(**inputs, streamer=text_streamer,
                          max_new_tokens=512,
                          do_sample=True,
                          temperature=0.9,
                          top_p=0.5,
                          top_k=20,
                          repetition_penalty=1.1,
                          eos_token_id=tokenizer.eos_token_id,
                          use_cache=True,
                          )

# Example call
question = "Which course is related to AI and Communication at Westminster?"
output_ids = chatml(question, model)

This script defines a custom chatml function that queries the model with a formatted Alpaca-style prompt and streams the answer to stdout as it is generated. Replace question with your query and model with your loaded Gemma-2-9b model instance.
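The function above expects tokenizer and model to already exist in scope. A minimal loading sketch, assuming the unsloth library is installed and a CUDA GPU is available (max_seq_length here is an illustrative choice, not a value stated on this card):

```python
from unsloth import FastLanguageModel

# Load the 4-bit finetuned checkpoint and its tokenizer.
# Repo id taken from this model card; max_seq_length is an assumption.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="roger33303/gemma-2-9b-Instruct-Finetune-website-QnA",
    max_seq_length=2048,
    load_in_4bit=True,
)
FastLanguageModel.for_inference(model)  # switch to inference-optimized mode
```

With model and tokenizer in scope this way, the chatml function above can be called directly.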


Uploaded model

  • Developed by: roger33303
  • License: apache-2.0
  • Finetuned from model: unsloth/gemma-2-9b-bnb-4bit
  • Model size: 9.24B params
  • Tensor type: BF16

Model tree for roger33303/gemma-2-9b-Instruct-Finetune-website-QnA

  • Base model: google/gemma-2-9b