bart-large-squad-qg-no-answer / eval /metric.middle.sentence.paragraph_answer.question.json
{"validation": {"Bleu_1": 0.5536249625262247, "Bleu_2": 0.3951173119309461, "Bleu_3": 0.30524855567627557, "Bleu_4": 0.24278193686898464, "METEOR": 0.2574020064203738, "ROUGE_L": 0.5133269425545001}, "test": {"Bleu_1": 0.5664093553442128, "Bleu_2": 0.4012541067066596, "Bleu_3": 0.3059161370702032, "Bleu_4": 0.23929033246506187, "METEOR": 0.25458372644354427, "ROUGE_L": 0.5089193096030656}}