File size: 111,076 Bytes
9877e08 |
1 2 |
{"cells":[{"cell_type":"markdown","metadata":{"id":"Ac6wadk3rmkK"},"source":["# LM Evaluation Harness (by [EleutherAI](https://www.eleuther.ai/))\n","\n","This [`LM-Evaluation-Harness`](https://github.com/EleutherAI/lm-evaluation-harness) provides a unified framework to test generative language models on a large number of different evaluation tasks. For a complete list of available tasks, see the [task table](https://github.com/EleutherAI/lm-evaluation-harness/blob/master/docs/task_table.md), or scroll to the bottom of the page.\n","\n","1. Clone the [lm-evaluation-harness](https://github.com/EleutherAI/lm-evaluation-harness) and install the necessary libraries (`sentencepiece` is required for the Llama tokenizer)."]},{"cell_type":"code","execution_count":null,"metadata":{"colab":{"base_uri":"https://localhost:8080/"},"executionInfo":{"elapsed":40508,"status":"ok","timestamp":1698580809187,"user":{"displayName":"Nicholas CorrΓͺa","userId":"09736120585766268588"},"user_tz":-60},"id":"UA5I86u91e0A","outputId":"2342bf64-d93b-441f-8643-8e4003c6ef6c"},"outputs":[],"source":["!git clone --branch master https://github.com/EleutherAI/lm-evaluation-harness\n","!cd lm-evaluation-harness && pip install -e . -q\n","!pip install cohere tiktoken sentencepiece -q"]},{"cell_type":"markdown","metadata":{},"source":["2. Run the evaluation harness on the selected tasks."]},{"cell_type":"code","execution_count":null,"metadata":{"colab":{"base_uri":"https://localhost:8080/"},"executionInfo":{"elapsed":1753416,"status":"ok","timestamp":1698583348574,"user":{"displayName":"Nicholas CorrΓͺa","userId":"09736120585766268588"},"user_tz":-60},"id":"pnHoAVK25QZn","outputId":"23f65f99-82f8-423f-9c8a-b1d4f2bdbd56"},"outputs":[],"source":["!huggingface-cli login --token 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\n","!cd lm-evaluation-harness && python main.py \\\n"," --model hf-causal \\\n"," --model_args pretrained=nicholasKluge/Aira-OPT-1B3 \\\n"," --tasks arc_challenge,truthfulqa_mc,toxigen \\\n"," --device cuda:0"]},{"cell_type":"markdown","metadata":{"id":"4Bm78wiZ4Own"},"source":["## Task Table π\n","\n","| Task Name |Train|Val|Test|Val/Test Docs| Metrics |\n","|---------------------------------------------------------|-----|---|----|------------:|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|\n","|anagrams1 | |β | | 10000|acc |\n","|anagrams2 | |β | | 10000|acc |\n","|anli_r1 |β |β |β | 1000|acc |\n","|anli_r2 |β |β |β | 1000|acc |\n","|anli_r3 |β |β |β | 1200|acc |\n","|arc_challenge |β |β |β | 1172|acc, acc_norm |\n","|arc_easy |β |β |β | 2376|acc, acc_norm |\n","|arithmetic_1dc | |β | | 2000|acc |\n","|arithmetic_2da | |β | | 2000|acc |\n","|arithmetic_2dm | |β | | 2000|acc |\n","|arithmetic_2ds | |β | | 2000|acc |\n","|arithmetic_3da | |β | | 2000|acc |\n","|arithmetic_3ds | |β | | 2000|acc |\n","|arithmetic_4da | |β | | 2000|acc |\n","|arithmetic_4ds | |β | | 2000|acc |\n","|arithmetic_5da | |β | | 2000|acc |\n","|arithmetic_5ds | |β | | 2000|acc |\n","|bigbench_causal_judgement | | |β | 190|multiple_choice_grade, exact_str_match |\n","|bigbench_date_understanding | | |β | 369|multiple_choice_grade, exact_str_match |\n","|bigbench_disambiguation_qa | | |β | 258|multiple_choice_grade, exact_str_match |\n","|bigbench_dyck_languages | | |β | 1000|multiple_choice_grade, exact_str_match |\n","|bigbench_formal_fallacies_syllogisms_negation | | |β | 14200|multiple_choice_grade, exact_str_match |\n","|bigbench_geometric_shapes | | |β | 359|multiple_choice_grade, exact_str_match |\n","|bigbench_hyperbaton | | |β | 50000|multiple_choice_grade, exact_str_match |\n","|bigbench_logical_deduction_five_objects | | |β | 500|multiple_choice_grade, exact_str_match |\n","|bigbench_logical_deduction_seven_objects | | |β | 700|multiple_choice_grade, exact_str_match |\n","|bigbench_logical_deduction_three_objects | | |β | 300|multiple_choice_grade, exact_str_match |\n","|bigbench_movie_recommendation | | |β | 500|multiple_choice_grade, exact_str_match |\n","|bigbench_navigate | | |β | 1000|multiple_choice_grade, exact_str_match |\n","|bigbench_reasoning_about_colored_objects | | |β | 2000|multiple_choice_grade, exact_str_match |\n","|bigbench_ruin_names | | |β | 448|multiple_choice_grade, exact_str_match |\n","|bigbench_salient_translation_error_detection | | |β | 998|multiple_choice_grade, exact_str_match |\n","|bigbench_snarks | | |β | 181|multiple_choice_grade, exact_str_match |\n","|bigbench_sports_understanding | | |β | 986|multiple_choice_grade, exact_str_match |\n","|bigbench_temporal_sequences | | |β | 1000|multiple_choice_grade, exact_str_match |\n","|bigbench_tracking_shuffled_objects_five_objects | | |β | 1250|multiple_choice_grade, exact_str_match |\n","|bigbench_tracking_shuffled_objects_seven_objects | | |β | 1750|multiple_choice_grade, exact_str_match |\n","|bigbench_tracking_shuffled_objects_three_objects | | |β | 300|multiple_choice_grade, exact_str_match |\n","|blimp_adjunct_island | |β | | 1000|acc |\n","|blimp_anaphor_gender_agreement | |β | | 1000|acc |\n","|blimp_anaphor_number_agreement | |β | | 1000|acc |\n","|blimp_animate_subject_passive | |β | | 1000|acc |\n","|blimp_animate_subject_trans | |β | | 1000|acc |\n","|blimp_causative | |β | | 1000|acc |\n","|blimp_complex_NP_island | |β | | 1000|acc |\n","|blimp_coordinate_structure_constraint_complex_left_branch| |β | | 1000|acc |\n","|blimp_coordinate_structure_constraint_object_extraction | |β | | 1000|acc |\n","|blimp_determiner_noun_agreement_1 | |β | | 1000|acc |\n","|blimp_determiner_noun_agreement_2 | |β | | 1000|acc |\n","|blimp_determiner_noun_agreement_irregular_1 | |β | | 1000|acc |\n","|blimp_determiner_noun_agreement_irregular_2 | |β | | 1000|acc |\n","|blimp_determiner_noun_agreement_with_adj_2 | |β | | 1000|acc |\n","|blimp_determiner_noun_agreement_with_adj_irregular_1 | |β | | 1000|acc |\n","|blimp_determiner_noun_agreement_with_adj_irregular_2 | |β | | 1000|acc |\n","|blimp_determiner_noun_agreement_with_adjective_1 | |β | | 1000|acc |\n","|blimp_distractor_agreement_relational_noun | |β | | 1000|acc |\n","|blimp_distractor_agreement_relative_clause | |β | | 1000|acc |\n","|blimp_drop_argument | |β | | 1000|acc |\n","|blimp_ellipsis_n_bar_1 | |β | | 1000|acc |\n","|blimp_ellipsis_n_bar_2 | |β | | 1000|acc |\n","|blimp_existential_there_object_raising | |β | | 1000|acc |\n","|blimp_existential_there_quantifiers_1 | |β | | 1000|acc |\n","|blimp_existential_there_quantifiers_2 | |β | | 1000|acc |\n","|blimp_existential_there_subject_raising | |β | | 1000|acc |\n","|blimp_expletive_it_object_raising | |β | | 1000|acc |\n","|blimp_inchoative | |β | | 1000|acc |\n","|blimp_intransitive | |β | | 1000|acc |\n","|blimp_irregular_past_participle_adjectives | |β | | 1000|acc |\n","|blimp_irregular_past_participle_verbs | |β | | 1000|acc |\n","|blimp_irregular_plural_subject_verb_agreement_1 | |β | | 1000|acc |\n","|blimp_irregular_plural_subject_verb_agreement_2 | |β | | 1000|acc |\n","|blimp_left_branch_island_echo_question | |β | | 1000|acc |\n","|blimp_left_branch_island_simple_question | |β | | 1000|acc |\n","|blimp_matrix_question_npi_licensor_present | |β | | 1000|acc |\n","|blimp_npi_present_1 | |β | | 1000|acc |\n","|blimp_npi_present_2 | |β | | 1000|acc |\n","|blimp_only_npi_licensor_present | |β | | 1000|acc |\n","|blimp_only_npi_scope | |β | | 1000|acc |\n","|blimp_passive_1 | |β | | 1000|acc |\n","|blimp_passive_2 | |β | | 1000|acc |\n","|blimp_principle_A_c_command | |β | | 1000|acc |\n","|blimp_principle_A_case_1 | |β | | 1000|acc |\n","|blimp_principle_A_case_2 | |β | | 1000|acc |\n","|blimp_principle_A_domain_1 | |β | | 1000|acc |\n","|blimp_principle_A_domain_2 | |β | | 1000|acc |\n","|blimp_principle_A_domain_3 | |β | | 1000|acc |\n","|blimp_principle_A_reconstruction | |β | | 1000|acc |\n","|blimp_regular_plural_subject_verb_agreement_1 | |β | | 1000|acc |\n","|blimp_regular_plural_subject_verb_agreement_2 | |β | | 1000|acc |\n","|blimp_sentential_negation_npi_licensor_present | |β | | 1000|acc |\n","|blimp_sentential_negation_npi_scope | |β | | 1000|acc |\n","|blimp_sentential_subject_island | |β | | 1000|acc |\n","|blimp_superlative_quantifiers_1 | |β | | 1000|acc |\n","|blimp_superlative_quantifiers_2 | |β | | 1000|acc |\n","|blimp_tough_vs_raising_1 | |β | | 1000|acc |\n","|blimp_tough_vs_raising_2 | |β | | 1000|acc |\n","|blimp_transitive | |β | | 1000|acc |\n","|blimp_wh_island | |β | | 1000|acc |\n","|blimp_wh_questions_object_gap | |β | | 1000|acc |\n","|blimp_wh_questions_subject_gap | |β | | 1000|acc |\n","|blimp_wh_questions_subject_gap_long_distance | |β | | 1000|acc |\n","|blimp_wh_vs_that_no_gap | |β | | 1000|acc |\n","|blimp_wh_vs_that_no_gap_long_distance | |β | | 1000|acc |\n","|blimp_wh_vs_that_with_gap | |β | | 1000|acc |\n","|blimp_wh_vs_that_with_gap_long_distance | |β | | 1000|acc |\n","|boolq |β |β | | 3270|acc |\n","|cb |β |β | | 56|acc, f1 |\n","|cola |β |β | | 1043|mcc |\n","|copa |β |β | | 100|acc |\n","|coqa |β |β | | 500|f1, em |\n","|crows_pairs_english | |β | | 1677|likelihood_difference, pct_stereotype |\n","|crows_pairs_english_age | |β | | 91|likelihood_difference, pct_stereotype |\n","|crows_pairs_english_autre | |β | | 11|likelihood_difference, pct_stereotype |\n","|crows_pairs_english_disability | |β | | 65|likelihood_difference, pct_stereotype |\n","|crows_pairs_english_gender | |β | | 320|likelihood_difference, pct_stereotype |\n","|crows_pairs_english_nationality | |β | | 216|likelihood_difference, pct_stereotype |\n","|crows_pairs_english_physical_appearance | |β | | 72|likelihood_difference, pct_stereotype |\n","|crows_pairs_english_race_color | |β | | 508|likelihood_difference, pct_stereotype |\n","|crows_pairs_english_religion | |β | | 111|likelihood_difference, pct_stereotype |\n","|crows_pairs_english_sexual_orientation | |β | | 93|likelihood_difference, pct_stereotype |\n","|crows_pairs_english_socioeconomic | |β | | 190|likelihood_difference, pct_stereotype |\n","|crows_pairs_french | |β | | 1677|likelihood_difference, pct_stereotype |\n","|crows_pairs_french_age | |β | | 90|likelihood_difference, pct_stereotype |\n","|crows_pairs_french_autre | |β | | 13|likelihood_difference, pct_stereotype |\n","|crows_pairs_french_disability | |β | | 66|likelihood_difference, pct_stereotype |\n","|crows_pairs_french_gender | |β | | 321|likelihood_difference, pct_stereotype |\n","|crows_pairs_french_nationality | |β | | 253|likelihood_difference, pct_stereotype |\n","|crows_pairs_french_physical_appearance | |β | | 72|likelihood_difference, pct_stereotype |\n","|crows_pairs_french_race_color | |β | | 460|likelihood_difference, pct_stereotype |\n","|crows_pairs_french_religion | |β | | 115|likelihood_difference, pct_stereotype |\n","|crows_pairs_french_sexual_orientation | |β | | 91|likelihood_difference, pct_stereotype |\n","|crows_pairs_french_socioeconomic | |β | | 196|likelihood_difference, pct_stereotype |\n","|cycle_letters | |β | | 10000|acc |\n","|drop |β |β | | 9536|em, f1 |\n","|ethics_cm |β | |β | 3885|acc |\n","|ethics_deontology |β | |β | 3596|acc, em |\n","|ethics_justice |β | |β | 2704|acc, em |\n","|ethics_utilitarianism |β | |β | 4808|acc |\n","|ethics_utilitarianism_original | | |β | 4808|acc |\n","|ethics_virtue |β | |β | 4975|acc, em |\n","|gsm8k |β | |β | 1319|acc |\n","|headqa |β |β |β | 2742|acc, acc_norm |\n","|headqa_en |β |β |β | 2742|acc, acc_norm |\n","|headqa_es |β |β |β | 2742|acc, acc_norm |\n","|hellaswag |β |β | | 10042|acc, acc_norm |\n","|hendrycksTest-abstract_algebra | |β |β | 100|acc, acc_norm |\n","|hendrycksTest-anatomy | |β |β | 135|acc, acc_norm |\n","|hendrycksTest-astronomy | |β |β | 152|acc, acc_norm |\n","|hendrycksTest-business_ethics | |β |β | 100|acc, acc_norm |\n","|hendrycksTest-clinical_knowledge | |β |β | 265|acc, acc_norm |\n","|hendrycksTest-college_biology | |β |β | 144|acc, acc_norm |\n","|hendrycksTest-college_chemistry | |β |β | 100|acc, acc_norm |\n","|hendrycksTest-college_computer_science | |β |β | 100|acc, acc_norm |\n","|hendrycksTest-college_mathematics | |β |β | 100|acc, acc_norm |\n","|hendrycksTest-college_medicine | |β |β | 173|acc, acc_norm |\n","|hendrycksTest-college_physics | |β |β | 102|acc, acc_norm |\n","|hendrycksTest-computer_security | |β |β | 100|acc, acc_norm |\n","|hendrycksTest-conceptual_physics | |β |β | 235|acc, acc_norm |\n","|hendrycksTest-econometrics | |β |β | 114|acc, acc_norm |\n","|hendrycksTest-electrical_engineering | |β |β | 145|acc, acc_norm |\n","|hendrycksTest-elementary_mathematics | |β |β | 378|acc, acc_norm |\n","|hendrycksTest-formal_logic | |β |β | 126|acc, acc_norm |\n","|hendrycksTest-global_facts | |β |β | 100|acc, acc_norm |\n","|hendrycksTest-high_school_biology | |β |β | 310|acc, acc_norm |\n","|hendrycksTest-high_school_chemistry | |β |β | 203|acc, acc_norm |\n","|hendrycksTest-high_school_computer_science | |β |β | 100|acc, acc_norm |\n","|hendrycksTest-high_school_european_history | |β |β | 165|acc, acc_norm |\n","|hendrycksTest-high_school_geography | |β |β | 198|acc, acc_norm |\n","|hendrycksTest-high_school_government_and_politics | |β |β | 193|acc, acc_norm |\n","|hendrycksTest-high_school_macroeconomics | |β |β | 390|acc, acc_norm |\n","|hendrycksTest-high_school_mathematics | |β |β | 270|acc, acc_norm |\n","|hendrycksTest-high_school_microeconomics | |β |β | 238|acc, acc_norm |\n","|hendrycksTest-high_school_physics | |β |β | 151|acc, acc_norm |\n","|hendrycksTest-high_school_psychology | |β |β | 545|acc, acc_norm |\n","|hendrycksTest-high_school_statistics | |β |β | 216|acc, acc_norm |\n","|hendrycksTest-high_school_us_history | |β |β | 204|acc, acc_norm |\n","|hendrycksTest-high_school_world_history | |β |β | 237|acc, acc_norm |\n","|hendrycksTest-human_aging | |β |β | 223|acc, acc_norm |\n","|hendrycksTest-human_sexuality | |β |β | 131|acc, acc_norm |\n","|hendrycksTest-international_law | |β |β | 121|acc, acc_norm |\n","|hendrycksTest-jurisprudence | |β |β | 108|acc, acc_norm |\n","|hendrycksTest-logical_fallacies | |β |β | 163|acc, acc_norm |\n","|hendrycksTest-machine_learning | |β |β | 112|acc, acc_norm |\n","|hendrycksTest-management | |β |β | 103|acc, acc_norm |\n","|hendrycksTest-marketing | |β |β | 234|acc, acc_norm |\n","|hendrycksTest-medical_genetics | |β |β | 100|acc, acc_norm |\n","|hendrycksTest-miscellaneous | |β |β | 783|acc, acc_norm |\n","|hendrycksTest-moral_disputes | |β |β | 346|acc, acc_norm |\n","|hendrycksTest-moral_scenarios | |β |β | 895|acc, acc_norm |\n","|hendrycksTest-nutrition | |β |β | 306|acc, acc_norm |\n","|hendrycksTest-philosophy | |β |β | 311|acc, acc_norm |\n","|hendrycksTest-prehistory | |β |β | 324|acc, acc_norm |\n","|hendrycksTest-professional_accounting | |β |β | 282|acc, acc_norm |\n","|hendrycksTest-professional_law | |β |β | 1534|acc, acc_norm |\n","|hendrycksTest-professional_medicine | |β |β | 272|acc, acc_norm |\n","|hendrycksTest-professional_psychology | |β |β | 612|acc, acc_norm |\n","|hendrycksTest-public_relations | |β |β | 110|acc, acc_norm |\n","|hendrycksTest-security_studies | |β |β | 245|acc, acc_norm |\n","|hendrycksTest-sociology | |β |β | 201|acc, acc_norm |\n","|hendrycksTest-us_foreign_policy | |β |β | 100|acc, acc_norm |\n","|hendrycksTest-virology | |β |β | 166|acc, acc_norm |\n","|hendrycksTest-world_religions | |β |β | 171|acc, acc_norm |\n","|iwslt17-ar-en | | |β | 1460|bleu, chrf, ter |\n","|iwslt17-en-ar | | |β | 1460|bleu, chrf, ter |\n","|lambada_openai | | |β | 5153|ppl, acc |\n","|lambada_openai_cloze | | |β | 5153|ppl, acc |\n","|lambada_openai_mt_de | | |β | 5153|ppl, acc |\n","|lambada_openai_mt_en | | |β | 5153|ppl, acc |\n","|lambada_openai_mt_es | | |β | 5153|ppl, acc |\n","|lambada_openai_mt_fr | | |β | 5153|ppl, acc |\n","|lambada_openai_mt_it | | |β | 5153|ppl, acc |\n","|lambada_standard | |β |β | 5153|ppl, acc |\n","|lambada_standard_cloze | |β |β | 5153|ppl, acc |\n","|logiqa |β |β |β | 651|acc, acc_norm |\n","|math_algebra |β | |β | 1187|acc |\n","|math_asdiv | |β | | 2305|acc |\n","|math_counting_and_prob |β | |β | 474|acc |\n","|math_geometry |β | |β | 479|acc |\n","|math_intermediate_algebra |β | |β | 903|acc |\n","|math_num_theory |β | |β | 540|acc |\n","|math_prealgebra |β | |β | 871|acc |\n","|math_precalc |β | |β | 546|acc |\n","|mathqa |β |β |β | 2985|acc, acc_norm |\n","|mc_taco | |β |β | 9442|f1, em |\n","|mgsm_bn |β | |β | 250|acc |\n","|mgsm_de |β | |β | 250|acc |\n","|mgsm_en |β | |β | 250|acc |\n","|mgsm_es |β | |β | 250|acc |\n","|mgsm_fr |β | |β | 250|acc |\n","|mgsm_ja |β | |β | 250|acc |\n","|mgsm_ru |β | |β | 250|acc |\n","|mgsm_sw |β | |β | 250|acc |\n","|mgsm_te |β | |β | 250|acc |\n","|mgsm_th |β | |β | 250|acc |\n","|mgsm_zh |β | |β | 250|acc |\n","|mnli |β |β | | 9815|acc |\n","|mnli_mismatched |β |β | | 9832|acc |\n","|mrpc |β |β | | 408|acc, f1 |\n","|multirc |β |β | | 4848|acc |\n","|mutual |β |β | | 886|r@1, r@2, mrr |\n","|mutual_plus |β |β | | 886|r@1, r@2, mrr |\n","|openbookqa |β |β |β | 500|acc, acc_norm |\n","|pawsx_de |β |β |β | 2000|acc |\n","|pawsx_en |β |β |β | 2000|acc |\n","|pawsx_es |β |β |β | 2000|acc |\n","|pawsx_fr |β |β |β | 2000|acc |\n","|pawsx_ja |β |β |β | 2000|acc |\n","|pawsx_ko |β |β |β | 2000|acc |\n","|pawsx_zh |β |β |β | 2000|acc |\n","|pile_arxiv | |β |β | 2407|word_perplexity, byte_perplexity, bits_per_byte |\n","|pile_bookcorpus2 | |β |β | 28|word_perplexity, byte_perplexity, bits_per_byte |\n","|pile_books3 | |β |β | 269|word_perplexity, byte_perplexity, bits_per_byte |\n","|pile_dm-mathematics | |β |β | 1922|word_perplexity, byte_perplexity, bits_per_byte |\n","|pile_enron | |β |β | 1010|word_perplexity, byte_perplexity, bits_per_byte |\n","|pile_europarl | |β |β | 157|word_perplexity, byte_perplexity, bits_per_byte |\n","|pile_freelaw | |β |β | 5101|word_perplexity, byte_perplexity, bits_per_byte |\n","|pile_github | |β |β | 18195|word_perplexity, byte_perplexity, bits_per_byte |\n","|pile_gutenberg | |β |β | 80|word_perplexity, byte_perplexity, bits_per_byte |\n","|pile_hackernews | |β |β | 1632|word_perplexity, byte_perplexity, bits_per_byte |\n","|pile_nih-exporter | |β |β | 1884|word_perplexity, byte_perplexity, bits_per_byte |\n","|pile_opensubtitles | |β |β | 642|word_perplexity, byte_perplexity, bits_per_byte |\n","|pile_openwebtext2 | |β |β | 32925|word_perplexity, byte_perplexity, bits_per_byte |\n","|pile_philpapers | |β |β | 68|word_perplexity, byte_perplexity, bits_per_byte |\n","|pile_pile-cc | |β |β | 52790|word_perplexity, byte_perplexity, bits_per_byte |\n","|pile_pubmed-abstracts | |β |β | 29895|word_perplexity, byte_perplexity, bits_per_byte |\n","|pile_pubmed-central | |β |β | 5911|word_perplexity, byte_perplexity, bits_per_byte |\n","|pile_stackexchange | |β |β | 30378|word_perplexity, byte_perplexity, bits_per_byte |\n","|pile_ubuntu-irc | |β |β | 22|word_perplexity, byte_perplexity, bits_per_byte |\n","|pile_uspto | |β |β | 11415|word_perplexity, byte_perplexity, bits_per_byte |\n","|pile_wikipedia | |β |β | 17511|word_perplexity, byte_perplexity, bits_per_byte |\n","|pile_youtubesubtitles | |β |β | 342|word_perplexity, byte_perplexity, bits_per_byte |\n","|piqa |β |β | | 1838|acc, acc_norm |\n","|prost | | |β | 18736|acc, acc_norm |\n","|pubmedqa | | |β | 1000|acc |\n","|qa4mre_2011 | | |β | 120|acc, acc_norm |\n","|qa4mre_2012 | | |β | 160|acc, acc_norm |\n","|qa4mre_2013 | | |β | 284|acc, acc_norm |\n","|qasper |β |β | | 1764|f1_yesno, f1_abstractive |\n","|qnli |β |β | | 5463|acc |\n","|qqp |β |β | | 40430|acc, f1 |\n","|race |β |β |β | 1045|acc |\n","|random_insertion | |β | | 10000|acc |\n","|record |β |β | | 10000|f1, em |\n","|reversed_words | |β | | 10000|acc |\n","|rte |β |β | | 277|acc |\n","|sciq |β |β |β | 1000|acc, acc_norm |\n","|scrolls_contractnli |β |β | | 1037|em, acc, acc_norm |\n","|scrolls_govreport |β |β | | 972|rouge1, rouge2, rougeL |\n","|scrolls_narrativeqa |β |β | | 3425|f1 |\n","|scrolls_qasper |β |β | | 984|f1 |\n","|scrolls_qmsum |β |β | | 272|rouge1, rouge2, rougeL |\n","|scrolls_quality |β |β | | 2086|em, acc, acc_norm |\n","|scrolls_summscreenfd |β |β | | 338|rouge1, rouge2, rougeL |\n","|squad2 |β |β | | 11873|exact, f1, HasAns_exact, HasAns_f1, NoAns_exact, NoAns_f1, best_exact, best_f1 |\n","|sst |β |β | | 872|acc |\n","|swag |β |β | | 20006|acc, acc_norm |\n","|toxigen |β | |β | 940|acc, acc_norm |\n","|triviaqa |β |β | | 11313|acc |\n","|truthfulqa_gen | |β | | 817|bleurt_max, bleurt_acc, bleurt_diff, bleu_max, bleu_acc, bleu_diff, rouge1_max, rouge1_acc, rouge1_diff, rouge2_max, rouge2_acc, rouge2_diff, rougeL_max, rougeL_acc, rougeL_diff|\n","|truthfulqa_mc | |β | | 817|mc1, mc2 |\n","|webqs |β | |β | 2032|acc |\n","|wic |β |β | | 638|acc |\n","|wikitext |β |β |β | 62|word_perplexity, byte_perplexity, bits_per_byte |\n","|winogrande |β |β | | 1267|acc |\n","|wmt14-en-fr | | |β | 3003|bleu, chrf, ter |\n","|wmt14-fr-en | | |β | 3003|bleu, chrf, ter |\n","|wmt16-de-en | | |β | 2999|bleu, chrf, ter |\n","|wmt16-en-de | | |β | 2999|bleu, chrf, ter |\n","|wmt16-en-ro | | |β | 1999|bleu, chrf, ter |\n","|wmt16-ro-en | | |β | 1999|bleu, chrf, ter |\n","|wmt20-cs-en | | |β | 664|bleu, chrf, ter |\n","|wmt20-de-en | | |β | 785|bleu, chrf, ter |\n","|wmt20-de-fr | | |β | 1619|bleu, chrf, ter |\n","|wmt20-en-cs | | |β | 1418|bleu, chrf, ter |\n","|wmt20-en-de | | |β | 1418|bleu, chrf, ter |\n","|wmt20-en-iu | | |β | 2971|bleu, chrf, ter |\n","|wmt20-en-ja | | |β | 1000|bleu, chrf, ter |\n","|wmt20-en-km | | |β | 2320|bleu, chrf, ter |\n","|wmt20-en-pl | | |β | 1000|bleu, chrf, ter |\n","|wmt20-en-ps | | |β | 2719|bleu, chrf, ter |\n","|wmt20-en-ru | | |β | 2002|bleu, chrf, ter |\n","|wmt20-en-ta | | |β | 1000|bleu, chrf, ter |\n","|wmt20-en-zh | | |β | 1418|bleu, chrf, ter |\n","|wmt20-fr-de | | |β | 1619|bleu, chrf, ter |\n","|wmt20-iu-en | | |β | 2971|bleu, chrf, ter |\n","|wmt20-ja-en | | |β | 993|bleu, chrf, ter |\n","|wmt20-km-en | | |β | 2320|bleu, chrf, ter |\n","|wmt20-pl-en | | |β | 1001|bleu, chrf, ter |\n","|wmt20-ps-en | | |β | 2719|bleu, chrf, ter |\n","|wmt20-ru-en | | |β | 991|bleu, chrf, ter |\n","|wmt20-ta-en | | |β | 997|bleu, chrf, ter |\n","|wmt20-zh-en | | |β | 2000|bleu, chrf, ter |\n","|wnli |β |β | | 71|acc |\n","|wsc |β |β | | 104|acc |\n","|wsc273 | | |β | 273|acc |\n","|xcopa_et | |β |β | 500|acc |\n","|xcopa_ht | |β |β | 500|acc |\n","|xcopa_id | |β |β | 500|acc |\n","|xcopa_it | |β |β | 500|acc |\n","|xcopa_qu | |β |β | 500|acc |\n","|xcopa_sw | |β |β | 500|acc |\n","|xcopa_ta | |β |β | 500|acc |\n","|xcopa_th | |β |β | 500|acc |\n","|xcopa_tr | |β |β | 500|acc |\n","|xcopa_vi | |β |β | 500|acc |\n","|xcopa_zh | |β |β | 500|acc |\n","|xnli_ar |β |β |β | 5010|acc |\n","|xnli_bg |β |β |β | 5010|acc |\n","|xnli_de |β |β |β | 5010|acc |\n","|xnli_el |β |β |β | 5010|acc |\n","|xnli_en |β |β |β | 5010|acc |\n","|xnli_es |β |β |β | 5010|acc |\n","|xnli_fr |β |β |β | 5010|acc |\n","|xnli_hi |β |β |β | 5010|acc |\n","|xnli_ru |β |β |β | 5010|acc |\n","|xnli_sw |β |β |β | 5010|acc |\n","|xnli_th |β |β |β | 5010|acc |\n","|xnli_tr |β |β |β | 5010|acc |\n","|xnli_ur |β |β |β | 5010|acc |\n","|xnli_vi |β |β |β | 5010|acc |\n","|xnli_zh |β |β |β | 5010|acc |\n","|xstory_cloze_ar |β |β | | 1511|acc |\n","|xstory_cloze_en |β |β | | 1511|acc |\n","|xstory_cloze_es |β |β | | 1511|acc |\n","|xstory_cloze_eu |β |β | | 1511|acc |\n","|xstory_cloze_hi |β |β | | 1511|acc |\n","|xstory_cloze_id |β |β | | 1511|acc |\n","|xstory_cloze_my |β |β | | 1511|acc |\n","|xstory_cloze_ru |β |β | | 1511|acc |\n","|xstory_cloze_sw |β |β | | 1511|acc |\n","|xstory_cloze_te |β |β | | 1511|acc |\n","|xstory_cloze_zh |β |β | | 1511|acc |\n","|xwinograd_en | | |β | 2325|acc |\n","|xwinograd_fr | | |β | 83|acc |\n","|xwinograd_jp | | |β | 959|acc |\n","|xwinograd_pt | | |β | 263|acc |\n","|xwinograd_ru | | |β | 315|acc |\n","|xwinograd_zh | | |β | 504|acc |\n","| Ceval-valid-computer_network | | β | | 19 | acc |\n","| Ceval-valid-operating_system | | β | | 19 | acc |\n","| Ceval-valid-computer_architecture | | β | | 21 | acc |\n","| Ceval-valid-college_programming | | β | | 37 | acc |\n","| Ceval-valid-college_physics | | β | | 19 | acc |\n","| Ceval-valid-college_chemistry | | β | | 24 | acc |\n","| Ceval-valid-advanced_mathematics | | β | | 19 | acc |\n","| Ceval-valid-probability_and_statistics | | β | | 18 | acc |\n","| Ceval-valid-discrete_mathematics | | β | | 16 | acc |\n","| Ceval-valid-electrical_engineer | | β | | 37 | acc |\n","| Ceval-valid-metrology_engineer | | β | | 24 | acc |\n","| Ceval-valid-high_school_mathematics | | β | | 18 | acc |\n","| Ceval-valid-high_school_physics | | β | | 19 | acc |\n","| Ceval-valid-high_school_chemistry | | β | | 19 | acc |\n","| Ceval-valid-high_school_biology | | β | | 19 | acc |\n","| Ceval-valid-middle_school_mathematics | | β | | 19 | acc |\n","| Ceval-valid-middle_school_biology | | β | | 21 | acc |\n","| Ceval-valid-middle_school_physics | | β | | 19 | acc |\n","| Ceval-valid-middle_school_chemistry | | β | | 20 | acc |\n","| Ceval-valid-veterinary_medicine | | β | | 23 | acc |\n","| Ceval-valid-college_economics | | β | | 55 | acc |\n","| Ceval-valid-business_administration | | β | | 33 | acc |\n","| Ceval-valid-marxism | | β | | 19 | acc |\n","| Ceval-valid-mao_zedong_thought | | β | | 24 | acc |\n","| Ceval-valid-education_science | | β | | 29 | acc |\n","| Ceval-valid-teacher_qualification | | β | | 44 | acc |\n","| Ceval-valid-high_school_politics | | β | | 19 | acc |\n","| Ceval-valid-high_school_geography | | β | | 19 | acc |\n","| Ceval-valid-middle_school_politics | | β | | 21 | acc |\n","| Ceval-valid-middle_school_geography | | β | | 12 | acc |\n","| Ceval-valid-modern_chinese_history | | β | | 23 | acc |\n","| Ceval-valid-ideological_and_moral_cultivation | | β | | 19 | acc |\n","| Ceval-valid-logic | | β | | 22 | acc |\n","| Ceval-valid-law | | β | | 24 | acc |\n","| Ceval-valid-chinese_language_and_literature | | β | | 23 | acc |\n","| Ceval-valid-art_studies | | β | | 33 | acc |\n","| Ceval-valid-professional_tour_guide | | β | | 29 | acc |\n","| Ceval-valid-legal_professional | | β | | 23 | acc |\n","| Ceval-valid-high_school_chinese | | β | | 19 | acc |\n","| Ceval-valid-high_school_history | | β | | 20 | acc |\n","| Ceval-valid-middle_school_history | | β | | 22 | acc |\n","| Ceval-valid-civil_servant | | β | | 47 | acc |\n","| Ceval-valid-sports_science | | β | | 19 | acc |\n","| Ceval-valid-plant_protection | | β | | 22 | acc |\n","| Ceval-valid-basic_medicine | | β | | 19 | acc |\n","| Ceval-valid-clinical_medicine | | β | | 22 | acc |\n","| Ceval-valid-urban_and_rural_planner | | β | | 46 | acc |\n","| Ceval-valid-accountant | | β | | 49 | acc |\n","| Ceval-valid-fire_engineer | | β | | 31 | acc |\n","| Ceval-valid-environmental_impact_assessment_engineer | | β | | 31 | acc |\n","| Ceval-valid-tax_accountant | | β | | 49 | acc |\n","| Ceval-valid-physician | | β | | 49 | acc |"]}],"metadata":{"accelerator":"GPU","colab":{"authorship_tag":"ABX9TyOYjJFbDr/lKnnIcv2j6MLc","gpuType":"T4","machine_shape":"hm","provenance":[]},"kernelspec":{"display_name":"Python 3","name":"python3"},"language_info":{"name":"python"}},"nbformat":4,"nbformat_minor":0}
|