---
license: cc
task_categories:
  - question-answering
language:
  - en
tags:
  - climate
  - chemistry
  - biology
  - earth science
pretty_name: NASA-QA
---

NASA SMD and IBM Research developed NASA-QA, an extractive question-answering benchmark for the Earth science domain. First, 39 paragraphs were sourced from Earth science papers published in AGU and AMS journals. Subject-matter experts from NASA then formulated questions and marked the corresponding answer spans in these paragraphs, yielding 117 question-answer pairs. The dataset is split into a training set of 90 pairs and a validation set of 27 pairs. Questions average 11 words in length and paragraphs 150 words. The evaluation metric for this task is F1, measuring the token overlap between predicted and ground-truth answers.
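For reference, the sketch below shows a standard SQuAD-style token-overlap F1 of the kind described above. It is a minimal illustration, not the official NASA-QA scoring script; the exact answer normalisation (e.g. punctuation or article stripping) used for this benchmark may differ.

```python
from collections import Counter

def token_f1(prediction: str, ground_truth: str) -> float:
    """SQuAD-style token-overlap F1 between a predicted and a gold answer span."""
    pred_tokens = prediction.lower().split()
    gold_tokens = ground_truth.lower().split()
    if not pred_tokens or not gold_tokens:
        # Both empty -> perfect match; only one empty -> no overlap
        return float(pred_tokens == gold_tokens)
    common = Counter(pred_tokens) & Counter(gold_tokens)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(gold_tokens)
    return 2 * precision * recall / (precision + recall)

# Example: partial overlap between a predicted and a gold answer
print(token_f1("sea surface temperature anomalies", "surface temperature anomalies"))  # ~0.857
```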

*Figure: evaluation metrics.*

**Note**

This dataset is released in support of the training and evaluation of the encoder language model "Indus".

The accompanying paper can be found here: https://arxiv.org/abs/2405.10725