
t5-small-squad-qg-v2

This model is a fine-tuned version of t5-small on the SQuAD dataset. It achieves the following results on the evaluation set (one way to compute such metrics is sketched after the list):

  • Loss: 1.6608
  • BLEU: 20.00
  • Rouge1: 47.69
  • Rouge2: 26.43
  • RougeL: 44.15
  • RougeLSum: 44.15
  • METEOR: 45.84
  • BertScore: 91.82
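
These scores can be reproduced approximately with the Hugging Face evaluate library. The sketch below is illustrative only: it uses the standard bleu, rouge, meteor, and bertscore metrics with made-up prediction/reference strings, not the actual evaluation script behind the numbers above.

import evaluate

# Illustrative sketch only; the exact evaluation script behind the scores above is not shown.
predictions = ["What club did Messi join in July 2023?"]   # model-generated questions
references = ["Which club did Messi join in July 2023?"]   # reference questions

bleu = evaluate.load("bleu")
rouge = evaluate.load("rouge")
meteor = evaluate.load("meteor")
bertscore = evaluate.load("bertscore")

print(bleu.compute(predictions=predictions, references=[[r] for r in references]))
print(rouge.compute(predictions=predictions, references=references))
print(meteor.compute(predictions=predictions, references=references))
print(bertscore.compute(predictions=predictions, references=references, lang="en"))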

Model description

Intended uses & limitations

  1. Define two helper functions: one for highlighting the answer in the paragraph and one for preparing the instruction prompt that will be fed to the model (a quick toy example follows the helpers):
def highlight_answer(context, answer):
    # Wrap every occurrence of the answer with <h> tokens so the model
    # knows which span the generated question should target.
    context_splits = context.split(answer)
    return f" <h> {answer} <h> ".join(context_splits)

def prepare_instruction(answer_highlighted_context):
    # Keep the prompt left-aligned so no stray indentation is fed to the model.
    instruction_prompt = f"""Generate a question whose answer is highlighted by <h> from the context delimited by the triple backticks.
context:
```
{answer_highlighted_context}
```
"""

    return instruction_prompt
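
As a quick check, the two helpers behave like this on a toy sentence (the example text below is illustrative and not taken from SQuAD):

# Toy example showing what the helpers produce.
toy_context = "Lionel Messi was born in Rosario."
highlighted = highlight_answer(context=toy_context, answer="Rosario")
print(highlighted)
# Lionel Messi was born in  <h> Rosario <h> .

print(prepare_instruction(highlighted))
# Generate a question whose answer is highlighted by <h> from the context delimited by the triple backticks.
# context:
# ```
# Lionel Messi was born in  <h> Rosario <h> .
# ```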
  2. Use the model as a Hugging Face Pipeline:
from transformers import pipeline

pipe = pipeline('text2text-generation', model='mohammedaly22/t5-small-squad-qg-v2')

context = """During the 2011–12 season, he set the La Liga and European records\
for most goals scored in a single season, while establishing himself as Barcelona's\
all-time top scorer. The following two seasons, Messi finished second for the Ballon\
d'Or behind Cristiano Ronaldo (his perceived career rival), before regaining his best\
form during the 2014–15 campaign, becoming the all-time top scorer in La Liga and \
leading Barcelona to a historic second treble, after which he was awarded a fifth \
Ballon d'Or in 2015. Messi assumed captaincy of Barcelona in 2018, and won a record \
sixth Ballon d'Or in 2019. Out of contract, he signed for French club Paris Saint-Germain\
in August 2021, spending two seasons at the club and winning Ligue 1 twice. Messi \
joined American club Inter Miami in July 2023, winning the Leagues Cup in August of that year.
"""

answer_highlighted_context = highlight_answer(context=context, answer='Inter Miami')
prompt = prepare_instruction(answer_highlighted_context)

This will be the final prompt:

Generate a question whose answer is highlighted by <h> from the context delimited by the triple backticks.
context:
```
During the 2011–12 season, he set the La Liga and European records for most goals scored in a
single season, while establishing himself as Barcelona's all-time top scorer. The following two
seasons, Messi finished second for the Ballon d'Or behind Cristiano Ronaldo (his perceived career
rival), before regaining his best form during the 2014–15 campaign, becoming the all-time top
scorer in La Liga and leading Barcelona to a historic second treble, after which he was awarded a
fifth Ballon d'Or in 2015. Messi assumed captaincy of Barcelona in 2018, and won a record sixth
Ballon d'Or in 2019. Out of contract, he signed for French club Paris Saint-Germain in August
2021, spending two seasons at the club and winning Ligue 1 twice. Messi joined American club
<h> Inter Miami <h> in July 2023, winning the Leagues Cup in August of that year.
```
  3. Use the loaded pipeline to generate questions whose answer is Inter Miami (an equivalent call without the pipeline is sketched after the results):
outputs = pipe(prompt, num_return_sequences=3, num_beams=5, num_beam_groups=5, diversity_penalty=1.0)
for output in outputs:
    print(output['generated_text'])

Result:

1. What club did Messi join in the 2023 season?
2. What was Messi's name of the club that won the Leagues Cup on July 20?
3. What club did Messi join in the Leagues Cup in July 2023?
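
The same questions can also be generated without the pipeline abstraction by loading the model and tokenizer directly. The sketch below mirrors the generation settings of the pipeline call above; the max_length value is an assumption, as the card does not specify one.

from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = 'mohammedaly22/t5-small-squad-qg-v2'
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Encode the instruction prompt built earlier and run diverse beam search,
# mirroring the pipeline call above.
inputs = tokenizer(prompt, return_tensors='pt', truncation=True)
output_ids = model.generate(
    **inputs,
    num_return_sequences=3,
    num_beams=5,
    num_beam_groups=5,
    diversity_penalty=1.0,
    max_length=64,  # assumption; not specified in the card
)
for ids in output_ids:
    print(tokenizer.decode(ids, skip_special_tokens=True))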

Training and evaluation data

More information needed
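
The summary above states that the model was fine-tuned on SQuAD. A minimal sketch of loading that dataset with the datasets library is shown below; the exact preprocessing and splits used for this model are not documented here.

from datasets import load_dataset

# Each SQuAD example provides a context, a question, and answer spans,
# which is the supervision needed for answer-aware question generation.
squad = load_dataset("squad")
print(squad["train"][0])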

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a rough mapping onto Seq2SeqTrainingArguments is sketched after the list):

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-06
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 1000
  • num_epochs: 10
  • mixed_precision_training: Native AMP
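
For reference, these settings roughly correspond to the Seq2SeqTrainingArguments sketched below. This is a hedged reconstruction, not the actual training script; output_dir and the fp16 flag are assumptions.

from transformers import Seq2SeqTrainingArguments

# Hypothetical mapping of the listed hyperparameters onto training arguments;
# the original training script is not part of this card.
training_args = Seq2SeqTrainingArguments(
    output_dir="t5-small-squad-qg-v2",   # assumption
    learning_rate=5e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,       # 32 * 4 = 128 effective train batch size
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-06,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=10,
    fp16=True,                           # "Native AMP" mixed precision
)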

Training results

Training Loss   Epoch   Step   Validation Loss
2.6867          0.73     500   1.9647
2.0737          1.46    1000   1.8141
1.9364          2.19    1500   1.7515
1.8745          2.92    2000   1.7215
1.8282          3.65    2500   1.7042
1.8030          4.38    3000   1.6913
1.7797          5.11    3500   1.6796
1.7592          5.84    4000   1.6749
1.7435          6.57    4500   1.6697
1.7427          7.30    5000   1.6667
1.7245          8.04    5500   1.6614
1.7211          8.77    6000   1.6621
1.7137          9.50    6500   1.6608

Framework versions

  • Transformers 4.38.1
  • Pytorch 2.1.2
  • Datasets 2.13.1
  • Tokenizers 0.15.2