
bert-base-finetuned-squad2

Model description

This model is based on bert-base-uncased and was fine-tuned on SQuAD2.0. The corresponding papers can be found here (model) and here (data).

How to use

from transformers import pipeline

model_name = "phiyodr/bert-base-finetuned-squad2"

# Build a question-answering pipeline that uses this model and its tokenizer
nlp = pipeline('question-answering', model=model_name, tokenizer=model_name)

inputs = {
    'question': 'What discipline did Winkelmann create?',
    'context': 'Johann Joachim Winckelmann was a German art historian and archaeologist. He was a pioneering Hellenist who first articulated the difference between Greek, Greco-Roman and Roman art. "The prophet and founding hero of modern archaeology", Winckelmann was one of the founders of scientific archaeology and first applied the categories of style on a large, systematic basis to the history of art. '
}

# Returns the predicted answer span together with its score and character offsets
nlp(inputs)
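
The pipeline handles tokenization and answer-span extraction internally. For reference, here is a minimal sketch of the same lookup done by hand with AutoModelForQuestionAnswering; it simply takes the argmax of the start/end logits and does not reproduce the pipeline's SQuAD2-style no-answer handling.

import torch
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

model_name = "phiyodr/bert-base-finetuned-squad2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForQuestionAnswering.from_pretrained(model_name)

question = "What discipline did Winkelmann create?"
context = ("Johann Joachim Winckelmann was a German art historian and archaeologist. "
           "Winckelmann was one of the founders of scientific archaeology.")

# Encode the question/context pair and run the model
encoding = tokenizer(question, context, return_tensors="pt", truncation=True, max_length=384)
with torch.no_grad():
    outputs = model(**encoding)

# The model scores every token as a possible answer start and end;
# the predicted answer is the span between the two argmax positions.
start = int(torch.argmax(outputs.start_logits))
end = int(torch.argmax(outputs.end_logits)) + 1
print(tokenizer.decode(encoding["input_ids"][0][start:end]))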

Training procedure

{
    "base_model": "bert-base-uncased",
    "do_lower_case": true,
    "learning_rate": 3e-5,
    "num_train_epochs": 4,
    "max_seq_length": 384,
    "doc_stride": 128,
    "max_query_length": 64,
    "batch_size": 96
}
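
The card does not state which training script was used. As a rough sketch (not the author's exact setup), the hyperparameters above would map onto a standard SQuAD-style preprocessing and Trainer configuration roughly as follows.

from transformers import AutoTokenizer, TrainingArguments

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased", do_lower_case=True)

def to_features(question, context):
    # Each question/context pair is encoded into overlapping 384-token windows
    # with a 128-token stride, so long contexts are split rather than cut off.
    # (max_query_length=64 would additionally cap the tokenized question length
    # in the original SQuAD preprocessing.)
    return tokenizer(
        question,
        context,
        max_length=384,
        stride=128,
        truncation="only_second",
        return_overflowing_tokens=True,
        return_offsets_mapping=True,
        padding="max_length",
    )

# Optimization settings map onto the Trainer configuration.
training_args = TrainingArguments(
    output_dir="bert-base-finetuned-squad2",
    learning_rate=3e-5,
    num_train_epochs=4,
    per_device_train_batch_size=96,
)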

Eval results

{
  "exact": 70.3950138970774,
  "f1": 73.90527661873521,
  "total": 11873,
  "HasAns_exact": 71.4574898785425,
  "HasAns_f1": 78.48808186475087,
  "HasAns_total": 5928,
  "NoAns_exact": 69.33557611438184,
  "NoAns_f1": 69.33557611438184,
  "NoAns_total": 5945
}
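
These figures follow the official SQuAD2.0 evaluation on the dev set. As an illustration (not the exact evaluation run behind the numbers above), the same keys can be produced with the squad_v2 metric from the evaluate library:

import evaluate

squad_v2 = evaluate.load("squad_v2")

# Toy example with one answerable and one unanswerable question;
# the metric returns overall exact/f1 plus the HasAns_* and NoAns_* breakdown.
references = [
    {"id": "q1", "answers": {"text": ["scientific archaeology"], "answer_start": [230]}},
    {"id": "q2", "answers": {"text": [], "answer_start": []}},
]
predictions = [
    {"id": "q1", "prediction_text": "scientific archaeology", "no_answer_probability": 0.0},
    {"id": "q2", "prediction_text": "", "no_answer_probability": 1.0},
]
print(squad_v2.compute(predictions=predictions, references=references))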