bart-large-squad-qg-no-answer / eval /metric.first.answer.paragraph_answer.question.json
{"validation": {"Bleu_1": 0.5136437721641545, "Bleu_2": 0.36129883780289335, "Bleu_3": 0.2767492909752304, "Bleu_4": 0.21892783038362384, "METEOR": 0.24306943173647258, "ROUGE_L": 0.4950254814279804}, "test": {"Bleu_1": 0.4892850716330892, "Bleu_2": 0.33554926957326286, "Bleu_3": 0.2516583105062437, "Bleu_4": 0.19483124274569025, "METEOR": 0.23359626643241924, "ROUGE_L": 0.4763053351560294}}