Babi Models
Collection · 3 items
Fine-tune and evaluate transformer models on Facebook's bAbI tasks.

Paper: Towards AI-Complete Question Answering: A Set of Prerequisite Toy Tasks

Training Code: p208p2002/bAbi-tasks-with-transformer-model
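For reference, each raw bAbI task file interleaves numbered story sentences with question lines, where a question line carries the question, the answer, and the supporting-fact IDs separated by tabs. Below is a minimal parsing sketch under that format; the function name and file path are illustrative (following the standard bAbI v1.2 release layout) and are not taken from the training repository.

```python
# Minimal sketch: parse a raw bAbI task file into (context, question, answer) dicts.
# The path used at the bottom is only an example of the bAbI v1.2 layout.
def parse_babi_file(path):
    examples = []
    story = []
    with open(path, encoding="utf-8") as f:
        for raw_line in f:
            line = raw_line.strip()
            if not line:
                continue
            line_no, _, text = line.partition(" ")
            if line_no == "1":
                story = []  # numbering restarts at 1 when a new story begins
            if "\t" in text:
                # question line: "<question>\t<answer>\t<supporting fact ids>"
                question, answer, _ = text.split("\t")
                examples.append({
                    "context": " ".join(story),
                    "question": question.strip(),
                    "answer": answer.strip(),
                })
            else:
                story.append(text)  # ordinary story sentence
    return examples

examples = parse_babi_file("tasks_1-20_v1-2/en-10k/qa1_single-supporting-fact_train.txt")
```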
task_no | task_name | score (%) |
---|---|---|
qa1 | single-supporting-fact | 100 |
qa2 | two-supporting-facts | 99.4 |
qa3 | three-supporting-facts | 62.0 |
qa4 | two-arg-relations | 100 |
qa5 | three-arg-relations | 96.7 |
qa6 | yes-no-questions | 100 |
qa7 | counting | 100 |
qa8 | lists-sets | 97.7 |
qa9 | simple-negation | 100 |
qa10 | indefinite-knowledge | 100 |
qa11 | basic-coreference | 100 |
qa12 | conjunction | 100 |
qa13 | compound-coreference | 100 |
qa14 | time-reasoning | 100 |
qa15 | basic-deduction | 100 |
qa16 | basic-induction | 100 |
qa17 | positional-reasoning | 100 |
qa18 | size-reasoning | 100 |
qa19 | path-finding | 100 |
qa20 | agents-motivations | 100 |
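The scores are presumably exact-match accuracy in percent on each task's test split. A minimal scoring sketch is shown below; `predict_answer` is a placeholder for whatever routine produces the model's answer string for one example, not a function from the training repository.

```python
# Illustrative per-task scoring: exact-match accuracy in percent.
# `predict_answer` is a placeholder callable, e.g. a thin wrapper around model.generate.
def exact_match_score(examples, predict_answer):
    correct = 0
    for ex in examples:
        prediction = predict_answer(ex["context"], ex["question"])
        if prediction.strip().lower() == ex["answer"].strip().lower():
            correct += 1
    return 100.0 * correct / len(examples)
```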
```python
# Please use with the following template
INPUT_TEMPLATE = """
Context:
{context}
Question:
{question}
Answer:
{answer}
"""

# Fill the template with one bAbI example and strip the surrounding whitespace.
input_text = INPUT_TEMPLATE.format_map({
    "context": context,
    "question": question,
    "answer": answer
}).strip()
```
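A minimal inference sketch using this template is given below. It assumes a causal (GPT-style) checkpoint loaded through `transformers`; the model name is a placeholder, not one of the checkpoints in this collection, and a seq2seq checkpoint would use `AutoModelForSeq2SeqLM` instead. At inference time the answer slot is left empty so the model completes the text after `Answer:`.

```python
# Illustrative inference sketch; INPUT_TEMPLATE is the template defined above.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "your-finetuned-babi-model"  # placeholder checkpoint id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

context = "Mary moved to the bathroom. John went to the hallway."
question = "Where is Mary?"

# Leave the answer slot empty so the model generates it after "Answer:".
prompt = INPUT_TEMPLATE.format_map({
    "context": context,
    "question": question,
    "answer": ""
}).strip()

inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=8)
answer = tokenizer.decode(
    output_ids[0][inputs["input_ids"].shape[1]:],  # keep only the newly generated tokens
    skip_special_tokens=True,
).strip()
print(answer)
```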