
# T5 for question-answering

This is a T5-base model fine-tuned on SQuAD1.1 for question answering using the text-to-text approach.
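In the text-to-text setup, each SQuAD example is flattened into a plain-text input and a plain-text answer target. Below is a minimal sketch of that conversion; the field names follow the SQuAD 1.1 layout, and the exact preprocessing used to train this model is not shown in this card.

```python
# Sketch: turn a SQuAD-style example into a text-to-text (input, target) pair.
# Field names follow the SQuAD 1.1 layout; the preprocessing actually used
# to train this model may differ in details.
def squad_to_text2text(example):
    input_text = "question: %s  context: %s </s>" % (
        example["question"], example["context"])
    target_text = "%s </s>" % example["answers"]["text"][0]
    return input_text, target_text

example = {
    "question": "What is Valhalla ?",
    "context": ("In Norse mythology, Valhalla is a majestic, enormous hall "
                "located in Asgard, ruled over by the god Odin."),
    "answers": {"text": ["a majestic, enormous hall located in Asgard"]},
}
print(squad_to_text2text(example))
```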

## Model training

This model was trained on a Colab TPU with 35 GB RAM for 4 epochs.
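The card does not include the training script. The sketch below shows one plausible way to reproduce such a fine-tuning run with the `datasets` library and the `transformers` Seq2SeqTrainer; everything except the epoch count (batch size, learning rate, sequence lengths) is an assumption, not the author's setup.

```python
# Minimal fine-tuning sketch under assumed hyperparameters; only the
# 4 epochs are taken from the description above.
from datasets import load_dataset
from transformers import (AutoModelForSeq2SeqLM, AutoTokenizer,
                          DataCollatorForSeq2Seq, Seq2SeqTrainer,
                          Seq2SeqTrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("t5-base")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-base")

def preprocess(batch):
    # Cast SQuAD examples into the text-to-text format used by this model
    inputs = ["question: %s  context: %s" % (q, c)
              for q, c in zip(batch["question"], batch["context"])]
    targets = [ans["text"][0] for ans in batch["answers"]]
    enc = tokenizer(inputs, max_length=512, truncation=True)
    enc["labels"] = tokenizer(targets, max_length=32, truncation=True)["input_ids"]
    return enc

train_data = load_dataset("squad", split="train").map(
    preprocess, batched=True,
    remove_columns=["id", "title", "context", "question", "answers"])

args = Seq2SeqTrainingArguments(
    output_dir="t5-base-squad",
    num_train_epochs=4,               # matches the 4 epochs reported above
    per_device_train_batch_size=8,    # assumed value
    learning_rate=3e-4,               # assumed value
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=train_data,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```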

## Results

| Metric      | Value   |
|-------------|---------|
| Exact Match | 81.5610 |
| F1          | 89.9601 |
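These are the standard SQuAD metrics. As a generic reference (not the exact script used to produce the numbers above), Exact Match and F1 can be computed with the `evaluate` library's `squad` metric:

```python
# Generic sketch of computing SQuAD Exact Match / F1 with the `evaluate`
# library; not the evaluation script used to produce the table above.
import evaluate

squad_metric = evaluate.load("squad")

predictions = [{"id": "1",
                "prediction_text": "a majestic, enormous hall located in Asgard"}]
references = [{"id": "1",
               "answers": {"text": ["a majestic, enormous hall located in Asgard"],
                           "answer_start": [0]}}]

print(squad_metric.compute(predictions=predictions, references=references))
# {'exact_match': 100.0, 'f1': 100.0}
```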

## Model in Action 🚀

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# AutoModelForSeq2SeqLM replaces the deprecated AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("valhalla/t5-base-squad")
model = AutoModelForSeq2SeqLM.from_pretrained("valhalla/t5-base-squad")

def get_answer(question, context):
    # T5 expects the task cast as plain text: "question: ...  context: ..."
    input_text = "question: %s  context: %s </s>" % (question, context)
    features = tokenizer([input_text], return_tensors='pt')

    out = model.generate(input_ids=features['input_ids'],
                         attention_mask=features['attention_mask'])

    # Skip special tokens so the answer comes back as clean text
    return tokenizer.decode(out[0], skip_special_tokens=True)

context = "In Norse mythology, Valhalla is a majestic, enormous hall located in Asgard, ruled over by the god Odin."
question = "What is Valhalla ?"

print(get_answer(question, context))
# output: 'a majestic, enormous hall located in Asgard, ruled over by the god Odin'
```
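As an alternative to calling `generate` directly, the same model can be driven through the `text2text-generation` pipeline; a brief sketch (the prompt format mirrors `get_answer` above):

```python
# Alternative: the text2text-generation pipeline wraps the same
# tokenize -> generate -> decode steps as get_answer above.
from transformers import pipeline

qa = pipeline("text2text-generation", model="valhalla/t5-base-squad")

prompt = ("question: What is Valhalla ?  context: In Norse mythology, "
          "Valhalla is a majestic, enormous hall located in Asgard, "
          "ruled over by the god Odin.")
print(qa(prompt)[0]["generated_text"])
```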

Play with this model in Colab.

Created by Suraj Patil.
