Dataset Preview
The full dataset viewer is not available because dataset generation failed; only a preview of the rows is shown below.
The dataset generation failed
Error code:   DatasetGenerationError
Exception:    ArrowNotImplementedError
Message:      Cannot write struct type 'versions' with no child field to Parquet. Consider adding a dummy child field.
Traceback:    Traceback (most recent call last):
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1870, in _prepare_split_single
                  writer.write_table(table)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/arrow_writer.py", line 620, in write_table
                  self._build_writer(inferred_schema=pa_table.schema)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/arrow_writer.py", line 441, in _build_writer
                  self.pa_writer = self._WRITER_CLASS(self.stream, schema)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/pyarrow/parquet/core.py", line 1010, in __init__
                  self.writer = _parquet.ParquetWriter(
                File "pyarrow/_parquet.pyx", line 2157, in pyarrow._parquet.ParquetWriter.__cinit__
                File "pyarrow/error.pxi", line 154, in pyarrow.lib.pyarrow_internal_check_status
                File "pyarrow/error.pxi", line 91, in pyarrow.lib.check_status
              pyarrow.lib.ArrowNotImplementedError: Cannot write struct type 'versions' with no child field to Parquet. Consider adding a dummy child field.
              
              During handling of the above exception, another exception occurred:
              
              Traceback (most recent call last):
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1886, in _prepare_split_single
                  num_examples, num_bytes = writer.finalize()
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/arrow_writer.py", line 639, in finalize
                  self._build_writer(self.schema)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/arrow_writer.py", line 441, in _build_writer
                  self.pa_writer = self._WRITER_CLASS(self.stream, schema)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/pyarrow/parquet/core.py", line 1010, in __init__
                  self.writer = _parquet.ParquetWriter(
                File "pyarrow/_parquet.pyx", line 2157, in pyarrow._parquet.ParquetWriter.__cinit__
                File "pyarrow/error.pxi", line 154, in pyarrow.lib.pyarrow_internal_check_status
                File "pyarrow/error.pxi", line 91, in pyarrow.lib.check_status
              pyarrow.lib.ArrowNotImplementedError: Cannot write struct type 'versions' with no child field to Parquet. Consider adding a dummy child field.
              
              The above exception was the direct cause of the following exception:
              
              Traceback (most recent call last):
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1405, in compute_config_parquet_and_info_response
                  parquet_operations = convert_to_parquet(builder)
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1044, in convert_to_parquet
                  builder.download_and_prepare(
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 924, in download_and_prepare
                  self._download_and_prepare(
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1000, in _download_and_prepare
                  self._prepare_split(split_generator, **prepare_split_kwargs)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1741, in _prepare_split
                  for job_id, done, content in self._prepare_split_single(
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1897, in _prepare_split_single
                  raise DatasetGenerationError("An error occurred while generating the dataset") from e
              datasets.exceptions.DatasetGenerationError: An error occurred while generating the dataset
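The root cause is visible in the preview rows below: several columns (such as `versions`) are empty dicts in every row, so the Arrow schema inferred by `datasets` contains a struct type with no child fields, which the Parquet writer cannot represent. A minimal sketch of the dummy-child workaround the error message suggests, assuming the rows are plain Python dicts prepared before upload (the field name `_dummy` is an illustrative placeholder, not part of any library API):

```python
def add_dummy_child(rows, columns):
    """Give each named dict column at least one child field so the
    inferred Arrow struct type has a child and is writable to Parquet."""
    for row in rows:
        for col in columns:
            # Only patch columns that are present and are empty dicts.
            if isinstance(row.get(col), dict) and not row[col]:
                row[col] = {"_dummy": None}  # hypothetical placeholder field
    return rows

rows = [{"versions": {}, "results": {"acc": 58.02}}]
patched = add_dummy_child(rows, ["versions"])
# patched[0]["versions"] is now {"_dummy": None}; "results" is untouched.
```

Applying a patch like this to every always-empty struct column (here `versions`, `config_tasks`, `summary_tasks`, and `summary_general`) before pushing the dataset should let the Parquet conversion, and hence the viewer, succeed.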


Column           Type
config_general   dict
results          dict
versions         dict
config_tasks     dict
summary_tasks    dict
summary_general  dict
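Each preview row below lists these six columns in order; the trailing `{}` entries are the empty struct columns that trigger the Parquet error above. As a quick diagnostic, assuming each row is available as a JSON string, the offending columns can be identified like this:

```python
import json

# A trimmed example row (field values shortened for illustration).
row = '{"config_general": {"model_name": "01-ai/Yi-1.5-34B-32K"}, "versions": {}}'
record = json.loads(row)

# Collect every column whose value is a dict with no keys:
# these become zero-child struct types in the inferred Arrow schema.
empty_struct_cols = [k for k, v in record.items() if isinstance(v, dict) and not v]
# empty_struct_cols == ["versions"]
```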
{ "model_name": "01-ai/Yi-1.5-34B-32K", "model_dtype": "float16", "model_size": 0 }
{ "harness-c_arc_challenge": { "acc_norm": 58.02, "acc_stderr": 0, "c_arc_challenge_25shot_acc": 53.5, "c_arc_challenge_25shot_acc_norm": 58.02 }, "harness-c_gsm8k": { "acc": 0, "acc_stderr": 0, "c_gsm8k_5shot_acc": 0 }, "harness-c_hellaswag": { "acc_norm": 67.48, "acc_stderr": 0, "c_hellaswag_10shot_acc": 49.28, "c_hellaswag_10shot_acc_norm": 67.48 }, "harness-c-sem-v2": { "acc": 88.8075, "acc_stderr": 0, "c_sem_v2-LLSRC_5shot_acc": 92.23, "c_sem_v2-SLPWC_5shot_acc": 83.57, "c_sem_v2-SLRFC_5shot_acc": 92.52, "c_sem_v2-SLSRC_5shot_acc": 86.91, "c_sem_v2-LLSRC_5shot_acc_norm": 92.23, "c_sem_v2-SLPWC_5shot_acc_norm": 83.57, "c_sem_v2-SLRFC_5shot_acc_norm": 92.52, "c_sem_v2-SLSRC_5shot_acc_norm": 86.91 }, "harness-c_truthfulqa_mc": { "mc2": 47.66, "acc_stderr": 0, "c_truthfulqa_mc_0shot_mc1": 26.32, "c_truthfulqa_mc_0shot_mc2": 47.66 }, "harness-c_winogrande": { "acc": 68.11, "acc_stderr": 0, "c_winogrande_0shot_acc": 68.11 }, "harness-cmmlu": { "acc_norm": 70.96, "acc_stderr": 0, "cmmlu-anatomy_5shot_acc": 61.48, "cmmlu_fullavg_5shot_acc": 70.96, "cmmlu-virology_5shot_acc": 52.41, "cmmlu-astronomy_5shot_acc": 82.24, "cmmlu-marketing_5shot_acc": 88.89, "cmmlu-nutrition_5shot_acc": 81.05, "cmmlu-sociology_5shot_acc": 85.07, "cmmlu-management_5shot_acc": 79.61, "cmmlu-philosophy_5shot_acc": 70.1, "cmmlu-prehistory_5shot_acc": 73.46, "cmmlu-human_aging_5shot_acc": 72.65, "cmmlu-econometrics_5shot_acc": 59.65, "cmmlu-formal_logic_5shot_acc": 48, "cmmlu-global_facts_5shot_acc": 48, "cmmlu-jurisprudence_5shot_acc": 83.33, "cmmlu-miscellaneous_5shot_acc": 80.08, "cmmlu-moral_disputes_5shot_acc": 71.39, "cmmlu-business_ethics_5shot_acc": 76, "cmmlu-college_biology_5shot_acc": 69.44, "cmmlu-college_physics_5shot_acc": 55.88, "cmmlu-human_sexuality_5shot_acc": 74.05, "cmmlu-moral_scenarios_5shot_acc": 51.51, "cmmlu-world_religions_5shot_acc": 74.85, "cmmlu-abstract_algebra_5shot_acc": 47, "cmmlu-college_medicine_5shot_acc": 73.99, "cmmlu-machine_learning_5shot_acc": 54.46, 
"cmmlu-medical_genetics_5shot_acc": 73, "cmmlu-professional_law_5shot_acc": 51.96, "cmmlu-public_relations_5shot_acc": 70, "cmmlu-security_studies_5shot_acc": 77.55, "cmmlu-college_chemistry_5shot_acc": 56, "cmmlu-computer_security_5shot_acc": 77, "cmmlu-international_law_5shot_acc": 90.91, "cmmlu-logical_fallacies_5shot_acc": 71.17, "cmmlu-us_foreign_policy_5shot_acc": 85, "cmmlu-clinical_knowledge_5shot_acc": 75.09, "cmmlu-conceptual_physics_5shot_acc": 75.74, "cmmlu-college_mathematics_5shot_acc": 45, "cmmlu-high_school_biology_5shot_acc": 83.87, "cmmlu-high_school_physics_5shot_acc": 53.64, "cmmlu-high_school_chemistry_5shot_acc": 61.58, "cmmlu-high_school_geography_5shot_acc": 81.31, "cmmlu-professional_medicine_5shot_acc": 70.96, "cmmlu-electrical_engineering_5shot_acc": 73.1, "cmmlu-elementary_mathematics_5shot_acc": 69.58, "cmmlu-high_school_psychology_5shot_acc": 83.12, "cmmlu-high_school_statistics_5shot_acc": 68.52, "cmmlu-high_school_us_history_5shot_acc": 87.75, "cmmlu-high_school_mathematics_5shot_acc": 49.26, "cmmlu-professional_accounting_5shot_acc": 56.38, "cmmlu-professional_psychology_5shot_acc": 70.1, "cmmlu-college_computer_science_5shot_acc": 67, "cmmlu-high_school_world_history_5shot_acc": 84.81, "cmmlu-high_school_macroeconomics_5shot_acc": 80, "cmmlu-high_school_microeconomics_5shot_acc": 86.55, "cmmlu-high_school_computer_science_5shot_acc": 83, "cmmlu-high_school_european_history_5shot_acc": 81.21, "cmmlu-high_school_government_and_politics_5shot_acc": 90.16, "cmmlu-anatomy_5shot_acc_norm": 61.48, "cmmlu_fullavg_5shot_acc_norm": 70.96, "cmmlu-virology_5shot_acc_norm": 52.41, "cmmlu-astronomy_5shot_acc_norm": 82.24, "cmmlu-marketing_5shot_acc_norm": 88.89, "cmmlu-nutrition_5shot_acc_norm": 81.05, "cmmlu-sociology_5shot_acc_norm": 85.07, "cmmlu-management_5shot_acc_norm": 79.61, "cmmlu-philosophy_5shot_acc_norm": 70.1, "cmmlu-prehistory_5shot_acc_norm": 73.46, "cmmlu-human_aging_5shot_acc_norm": 72.65, "cmmlu-econometrics_5shot_acc_norm": 
59.65, "cmmlu-formal_logic_5shot_acc_norm": 48, "cmmlu-global_facts_5shot_acc_norm": 48, "cmmlu-jurisprudence_5shot_acc_norm": 83.33, "cmmlu-miscellaneous_5shot_acc_norm": 80.08, "cmmlu-moral_disputes_5shot_acc_norm": 71.39, "cmmlu-business_ethics_5shot_acc_norm": 76, "cmmlu-college_biology_5shot_acc_norm": 69.44, "cmmlu-college_physics_5shot_acc_norm": 55.88, "cmmlu-human_sexuality_5shot_acc_norm": 74.05, "cmmlu-moral_scenarios_5shot_acc_norm": 51.51, "cmmlu-world_religions_5shot_acc_norm": 74.85, "cmmlu-abstract_algebra_5shot_acc_norm": 47, "cmmlu-college_medicine_5shot_acc_norm": 73.99, "cmmlu-machine_learning_5shot_acc_norm": 54.46, "cmmlu-medical_genetics_5shot_acc_norm": 73, "cmmlu-professional_law_5shot_acc_norm": 51.96, "cmmlu-public_relations_5shot_acc_norm": 70, "cmmlu-security_studies_5shot_acc_norm": 77.55, "cmmlu-college_chemistry_5shot_acc_norm": 56, "cmmlu-computer_security_5shot_acc_norm": 77, "cmmlu-international_law_5shot_acc_norm": 90.91, "cmmlu-logical_fallacies_5shot_acc_norm": 71.17, "cmmlu-us_foreign_policy_5shot_acc_norm": 85, "cmmlu-clinical_knowledge_5shot_acc_norm": 75.09, "cmmlu-conceptual_physics_5shot_acc_norm": 75.74, "cmmlu-college_mathematics_5shot_acc_norm": 45, "cmmlu-high_school_biology_5shot_acc_norm": 83.87, "cmmlu-high_school_physics_5shot_acc_norm": 53.64, "cmmlu-high_school_chemistry_5shot_acc_norm": 61.58, "cmmlu-high_school_geography_5shot_acc_norm": 81.31, "cmmlu-professional_medicine_5shot_acc_norm": 70.96, "cmmlu-electrical_engineering_5shot_acc_norm": 73.1, "cmmlu-elementary_mathematics_5shot_acc_norm": 69.58, "cmmlu-high_school_psychology_5shot_acc_norm": 83.12, "cmmlu-high_school_statistics_5shot_acc_norm": 68.52, "cmmlu-high_school_us_history_5shot_acc_norm": 87.75, "cmmlu-high_school_mathematics_5shot_acc_norm": 49.26, "cmmlu-professional_accounting_5shot_acc_norm": 56.38, "cmmlu-professional_psychology_5shot_acc_norm": 70.1, "cmmlu-college_computer_science_5shot_acc_norm": 67, 
"cmmlu-high_school_world_history_5shot_acc_norm": 84.81, "cmmlu-high_school_macroeconomics_5shot_acc_norm": 80, "cmmlu-high_school_microeconomics_5shot_acc_norm": 86.55, "cmmlu-high_school_computer_science_5shot_acc_norm": 83, "cmmlu-high_school_european_history_5shot_acc_norm": 81.21, "cmmlu-high_school_government_and_politics_5shot_acc_norm": 90.16 } }
{}
{}
{}
{}
{ "model_name": "01-ai/Yi-1.5-34B-Chat-16K", "model_dtype": "float16", "model_size": 0 }
{ "harness-c_arc_challenge": { "acc_norm": 63.74, "acc_stderr": 0, "c_arc_challenge_25shot_acc": 57.68, "c_arc_challenge_25shot_acc_norm": 63.74 }, "harness-c_gsm8k": { "acc": 66.72, "acc_stderr": 0, "c_gsm8k_5shot_acc": 66.72 }, "harness-c_hellaswag": { "acc_norm": 69.35, "acc_stderr": 0, "c_hellaswag_10shot_acc": 50.67, "c_hellaswag_10shot_acc_norm": 69.35 }, "harness-c-sem-v2": { "acc": 90.095, "acc_stderr": 0, "c_sem_v2-LLSRC_5shot_acc": 93.81, "c_sem_v2-SLPWC_5shot_acc": 85.71, "c_sem_v2-SLRFC_5shot_acc": 93.09, "c_sem_v2-SLSRC_5shot_acc": 87.77, "c_sem_v2-LLSRC_5shot_acc_norm": 93.81, "c_sem_v2-SLPWC_5shot_acc_norm": 85.71, "c_sem_v2-SLRFC_5shot_acc_norm": 93.09, "c_sem_v2-SLSRC_5shot_acc_norm": 87.77 }, "harness-c_truthfulqa_mc": { "mc2": 54.54, "acc_stderr": 0, "c_truthfulqa_mc_0shot_mc1": 32.93, "c_truthfulqa_mc_0shot_mc2": 54.54 }, "harness-c_winogrande": { "acc": 69.53, "acc_stderr": 0, "c_winogrande_0shot_acc": 69.53 }, "CLCC-H": { "acc": 0.5127, "acc_stderr": 0 }, "harness-cmmlu": { "acc_norm": 70.03, "acc_stderr": 0, "cmmlu_fullavg_5shot_acc": 70.03, "cmmlu-virology_5shot_acc": 52.41, "cmmlu-nutrition_5shot_acc": 80.07, "cmmlu-sociology_5shot_acc": 82.59, "cmmlu-philosophy_5shot_acc": 68.81, "cmmlu-prehistory_5shot_acc": 72.53, "cmmlu-moral_disputes_5shot_acc": 71.97, "cmmlu-moral_scenarios_5shot_acc": 60.56, "cmmlu-world_religions_5shot_acc": 75.44, "cmmlu-professional_law_5shot_acc": 52.61, "cmmlu-public_relations_5shot_acc": 71.82, "cmmlu-security_studies_5shot_acc": 75.51, "cmmlu-us_foreign_policy_5shot_acc": 86, "cmmlu-professional_medicine_5shot_acc": 72.43, "cmmlu-professional_accounting_5shot_acc": 57.09, "cmmlu-professional_psychology_5shot_acc": 70.59, "cmmlu_fullavg_5shot_acc_norm": 70.03, "cmmlu-virology_5shot_acc_norm": 52.41, "cmmlu-nutrition_5shot_acc_norm": 80.07, "cmmlu-sociology_5shot_acc_norm": 82.59, "cmmlu-philosophy_5shot_acc_norm": 68.81, "cmmlu-prehistory_5shot_acc_norm": 72.53, "cmmlu-moral_disputes_5shot_acc_norm": 71.97, 
"cmmlu-moral_scenarios_5shot_acc_norm": 60.56, "cmmlu-world_religions_5shot_acc_norm": 75.44, "cmmlu-professional_law_5shot_acc_norm": 52.61, "cmmlu-public_relations_5shot_acc_norm": 71.82, "cmmlu-security_studies_5shot_acc_norm": 75.51, "cmmlu-us_foreign_policy_5shot_acc_norm": 86, "cmmlu-professional_medicine_5shot_acc_norm": 72.43, "cmmlu-professional_accounting_5shot_acc_norm": 57.09, "cmmlu-professional_psychology_5shot_acc_norm": 70.59 } }
{}
{}
{}
{}
{ "model_name": "01-ai/Yi-1.5-34B-Chat", "model_dtype": "float16", "model_size": 0 }
{ "harness-c_arc_challenge": { "acc_norm": 61.6, "acc_stderr": 0, "c_arc_challenge_25shot_acc": 56.66, "c_arc_challenge_25shot_acc_norm": 61.6 }, "harness-c_gsm8k": { "acc": 68.31, "acc_stderr": 0, "c_gsm8k_5shot_acc": 68.31 }, "harness-c_hellaswag": { "acc_norm": 66.95, "acc_stderr": 0, "c_hellaswag_10shot_acc": 49.69, "c_hellaswag_10shot_acc_norm": 66.95 }, "harness-c-sem-v2": { "acc": 87.4375, "acc_stderr": 0, "c_sem_v2-LLSRC_5shot_acc": 91.08, "c_sem_v2-SLPWC_5shot_acc": 84.14, "c_sem_v2-SLRFC_5shot_acc": 91.22, "c_sem_v2-SLSRC_5shot_acc": 83.31, "c_sem_v2-LLSRC_5shot_acc_norm": 91.08, "c_sem_v2-SLPWC_5shot_acc_norm": 84.14, "c_sem_v2-SLRFC_5shot_acc_norm": 91.22, "c_sem_v2-SLSRC_5shot_acc_norm": 83.31 }, "harness-c_truthfulqa_mc": { "mc2": 56.28, "acc_stderr": 0, "c_truthfulqa_mc_0shot_mc1": 33.29, "c_truthfulqa_mc_0shot_mc2": 56.28 }, "harness-c_winogrande": { "acc": 68.27, "acc_stderr": 0, "c_winogrande_0shot_acc": 68.27 }, "CLCC-H": { "acc": 0.742, "acc_stderr": 0 }, "harness-cmmlu": { "acc_norm": 68.31, "acc_stderr": 0, "cmmlu-anatomy_5shot_acc": 65.19, "cmmlu_fullavg_5shot_acc": 68.31, "cmmlu-virology_5shot_acc": 53.01, "cmmlu-astronomy_5shot_acc": 80.92, "cmmlu-marketing_5shot_acc": 86.32, "cmmlu-nutrition_5shot_acc": 75.82, "cmmlu-sociology_5shot_acc": 80.6, "cmmlu-management_5shot_acc": 76.7, "cmmlu-philosophy_5shot_acc": 69.77, "cmmlu-prehistory_5shot_acc": 72.22, "cmmlu-human_aging_5shot_acc": 71.3, "cmmlu-econometrics_5shot_acc": 51.75, "cmmlu-formal_logic_5shot_acc": 44.8, "cmmlu-global_facts_5shot_acc": 50, "cmmlu-jurisprudence_5shot_acc": 76.85, "cmmlu-miscellaneous_5shot_acc": 76.76, "cmmlu-moral_disputes_5shot_acc": 67.92, "cmmlu-business_ethics_5shot_acc": 68, "cmmlu-college_biology_5shot_acc": 70.83, "cmmlu-college_physics_5shot_acc": 55.88, "cmmlu-human_sexuality_5shot_acc": 68.7, "cmmlu-moral_scenarios_5shot_acc": 50.28, "cmmlu-world_religions_5shot_acc": 76.02, "cmmlu-abstract_algebra_5shot_acc": 49, "cmmlu-college_medicine_5shot_acc": 
70.52, "cmmlu-machine_learning_5shot_acc": 47.32, "cmmlu-medical_genetics_5shot_acc": 75, "cmmlu-professional_law_5shot_acc": 49.22, "cmmlu-public_relations_5shot_acc": 66.36, "cmmlu-security_studies_5shot_acc": 73.47, "cmmlu-college_chemistry_5shot_acc": 53, "cmmlu-computer_security_5shot_acc": 76, "cmmlu-international_law_5shot_acc": 83.47, "cmmlu-logical_fallacies_5shot_acc": 65.03, "cmmlu-us_foreign_policy_5shot_acc": 85, "cmmlu-clinical_knowledge_5shot_acc": 69.43, "cmmlu-conceptual_physics_5shot_acc": 71.49, "cmmlu-college_mathematics_5shot_acc": 44, "cmmlu-high_school_biology_5shot_acc": 81.61, "cmmlu-high_school_physics_5shot_acc": 56.29, "cmmlu-high_school_chemistry_5shot_acc": 58.62, "cmmlu-high_school_geography_5shot_acc": 79.29, "cmmlu-professional_medicine_5shot_acc": 71.32, "cmmlu-electrical_engineering_5shot_acc": 64.83, "cmmlu-elementary_mathematics_5shot_acc": 68.25, "cmmlu-high_school_psychology_5shot_acc": 82.02, "cmmlu-high_school_statistics_5shot_acc": 65.28, "cmmlu-high_school_us_history_5shot_acc": 80.88, "cmmlu-high_school_mathematics_5shot_acc": 50, "cmmlu-professional_accounting_5shot_acc": 52.84, "cmmlu-professional_psychology_5shot_acc": 68.3, "cmmlu-college_computer_science_5shot_acc": 60, "cmmlu-high_school_world_history_5shot_acc": 84.39, "cmmlu-high_school_macroeconomics_5shot_acc": 76.92, "cmmlu-high_school_microeconomics_5shot_acc": 84.45, "cmmlu-high_school_computer_science_5shot_acc": 82, "cmmlu-high_school_european_history_5shot_acc": 75.76, "cmmlu-high_school_government_and_politics_5shot_acc": 82.9, "cmmlu-anatomy_5shot_acc_norm": 65.19, "cmmlu_fullavg_5shot_acc_norm": 68.31, "cmmlu-virology_5shot_acc_norm": 53.01, "cmmlu-astronomy_5shot_acc_norm": 80.92, "cmmlu-marketing_5shot_acc_norm": 86.32, "cmmlu-nutrition_5shot_acc_norm": 75.82, "cmmlu-sociology_5shot_acc_norm": 80.6, "cmmlu-management_5shot_acc_norm": 76.7, "cmmlu-philosophy_5shot_acc_norm": 69.77, "cmmlu-prehistory_5shot_acc_norm": 72.22, 
"cmmlu-human_aging_5shot_acc_norm": 71.3, "cmmlu-econometrics_5shot_acc_norm": 51.75, "cmmlu-formal_logic_5shot_acc_norm": 44.8, "cmmlu-global_facts_5shot_acc_norm": 50, "cmmlu-jurisprudence_5shot_acc_norm": 76.85, "cmmlu-miscellaneous_5shot_acc_norm": 76.76, "cmmlu-moral_disputes_5shot_acc_norm": 67.92, "cmmlu-business_ethics_5shot_acc_norm": 68, "cmmlu-college_biology_5shot_acc_norm": 70.83, "cmmlu-college_physics_5shot_acc_norm": 55.88, "cmmlu-human_sexuality_5shot_acc_norm": 68.7, "cmmlu-moral_scenarios_5shot_acc_norm": 50.28, "cmmlu-world_religions_5shot_acc_norm": 76.02, "cmmlu-abstract_algebra_5shot_acc_norm": 49, "cmmlu-college_medicine_5shot_acc_norm": 70.52, "cmmlu-machine_learning_5shot_acc_norm": 47.32, "cmmlu-medical_genetics_5shot_acc_norm": 75, "cmmlu-professional_law_5shot_acc_norm": 49.22, "cmmlu-public_relations_5shot_acc_norm": 66.36, "cmmlu-security_studies_5shot_acc_norm": 73.47, "cmmlu-college_chemistry_5shot_acc_norm": 53, "cmmlu-computer_security_5shot_acc_norm": 76, "cmmlu-international_law_5shot_acc_norm": 83.47, "cmmlu-logical_fallacies_5shot_acc_norm": 65.03, "cmmlu-us_foreign_policy_5shot_acc_norm": 85, "cmmlu-clinical_knowledge_5shot_acc_norm": 69.43, "cmmlu-conceptual_physics_5shot_acc_norm": 71.49, "cmmlu-college_mathematics_5shot_acc_norm": 44, "cmmlu-high_school_biology_5shot_acc_norm": 81.61, "cmmlu-high_school_physics_5shot_acc_norm": 56.29, "cmmlu-high_school_chemistry_5shot_acc_norm": 58.62, "cmmlu-high_school_geography_5shot_acc_norm": 79.29, "cmmlu-professional_medicine_5shot_acc_norm": 71.32, "cmmlu-electrical_engineering_5shot_acc_norm": 64.83, "cmmlu-elementary_mathematics_5shot_acc_norm": 68.25, "cmmlu-high_school_psychology_5shot_acc_norm": 82.02, "cmmlu-high_school_statistics_5shot_acc_norm": 65.28, "cmmlu-high_school_us_history_5shot_acc_norm": 80.88, "cmmlu-high_school_mathematics_5shot_acc_norm": 50, "cmmlu-professional_accounting_5shot_acc_norm": 52.84, "cmmlu-professional_psychology_5shot_acc_norm": 68.3, 
"cmmlu-college_computer_science_5shot_acc_norm": 60, "cmmlu-high_school_world_history_5shot_acc_norm": 84.39, "cmmlu-high_school_macroeconomics_5shot_acc_norm": 76.92, "cmmlu-high_school_microeconomics_5shot_acc_norm": 84.45, "cmmlu-high_school_computer_science_5shot_acc_norm": 82, "cmmlu-high_school_european_history_5shot_acc_norm": 75.76, "cmmlu-high_school_government_and_politics_5shot_acc_norm": 82.9 } }
{}
{}
{}
{}
{ "model_name": "01-ai/Yi-1.5-34B", "model_dtype": "float16", "model_size": 0 }
{ "harness-c_arc_challenge": { "acc_norm": 58.62, "acc_stderr": 0, "c_arc_challenge_25shot_acc": 53.92, "c_arc_challenge_25shot_acc_norm": 58.62 }, "harness-c_gsm8k": { "acc": 61.49, "acc_stderr": 0, "c_gsm8k_5shot_acc": 61.49 }, "harness-c_hellaswag": { "acc_norm": 68.68, "acc_stderr": 0, "c_hellaswag_10shot_acc": 50.18, "c_hellaswag_10shot_acc_norm": 68.68 }, "harness-c-sem-v2": { "acc": 88.16, "acc_stderr": 0, "c_sem_v2-LLSRC_5shot_acc": 91.51, "c_sem_v2-SLPWC_5shot_acc": 82.43, "c_sem_v2-SLRFC_5shot_acc": 92.23, "c_sem_v2-SLSRC_5shot_acc": 86.47, "c_sem_v2-LLSRC_5shot_acc_norm": 91.51, "c_sem_v2-SLPWC_5shot_acc_norm": 82.43, "c_sem_v2-SLRFC_5shot_acc_norm": 92.23, "c_sem_v2-SLSRC_5shot_acc_norm": 86.47 }, "harness-c_truthfulqa_mc": { "mc2": 50.7, "acc_stderr": 0, "c_truthfulqa_mc_0shot_mc1": 28.76, "c_truthfulqa_mc_0shot_mc2": 50.7 }, "harness-c_winogrande": { "acc": 68.59, "acc_stderr": 0, "c_winogrande_0shot_acc": 68.59 }, "harness-cmmlu": { "acc_norm": 70.56, "acc_stderr": 0, "cmmlu-anatomy_5shot_acc": 61.48, "cmmlu_fullavg_5shot_acc": 70.56, "cmmlu-virology_5shot_acc": 49.4, "cmmlu-astronomy_5shot_acc": 80.26, "cmmlu-marketing_5shot_acc": 87.61, "cmmlu-nutrition_5shot_acc": 79.08, "cmmlu-sociology_5shot_acc": 85.07, "cmmlu-management_5shot_acc": 83.5, "cmmlu-philosophy_5shot_acc": 71.38, "cmmlu-prehistory_5shot_acc": 72.53, "cmmlu-human_aging_5shot_acc": 69.06, "cmmlu-econometrics_5shot_acc": 55.26, "cmmlu-formal_logic_5shot_acc": 49.6, "cmmlu-global_facts_5shot_acc": 54, "cmmlu-jurisprudence_5shot_acc": 81.48, "cmmlu-miscellaneous_5shot_acc": 78.93, "cmmlu-moral_disputes_5shot_acc": 72.54, "cmmlu-business_ethics_5shot_acc": 78, "cmmlu-college_biology_5shot_acc": 72.92, "cmmlu-college_physics_5shot_acc": 54.9, "cmmlu-human_sexuality_5shot_acc": 77.1, "cmmlu-moral_scenarios_5shot_acc": 45.03, "cmmlu-world_religions_5shot_acc": 70.76, "cmmlu-abstract_algebra_5shot_acc": 42, "cmmlu-college_medicine_5shot_acc": 77.46, "cmmlu-machine_learning_5shot_acc": 53.57, 
"cmmlu-medical_genetics_5shot_acc": 72, "cmmlu-professional_law_5shot_acc": 52.22, "cmmlu-public_relations_5shot_acc": 66.36, "cmmlu-security_studies_5shot_acc": 77.14, "cmmlu-college_chemistry_5shot_acc": 54, "cmmlu-computer_security_5shot_acc": 76, "cmmlu-international_law_5shot_acc": 90.91, "cmmlu-logical_fallacies_5shot_acc": 71.78, "cmmlu-us_foreign_policy_5shot_acc": 86, "cmmlu-clinical_knowledge_5shot_acc": 75.85, "cmmlu-conceptual_physics_5shot_acc": 72.77, "cmmlu-college_mathematics_5shot_acc": 44, "cmmlu-high_school_biology_5shot_acc": 84.84, "cmmlu-high_school_physics_5shot_acc": 54.97, "cmmlu-high_school_chemistry_5shot_acc": 63.05, "cmmlu-high_school_geography_5shot_acc": 82.83, "cmmlu-professional_medicine_5shot_acc": 74.26, "cmmlu-electrical_engineering_5shot_acc": 67.59, "cmmlu-elementary_mathematics_5shot_acc": 71.16, "cmmlu-high_school_psychology_5shot_acc": 83.12, "cmmlu-high_school_statistics_5shot_acc": 67.13, "cmmlu-high_school_us_history_5shot_acc": 86.76, "cmmlu-high_school_mathematics_5shot_acc": 47.41, "cmmlu-professional_accounting_5shot_acc": 56.38, "cmmlu-professional_psychology_5shot_acc": 69.12, "cmmlu-college_computer_science_5shot_acc": 64, "cmmlu-high_school_world_history_5shot_acc": 86.5, "cmmlu-high_school_macroeconomics_5shot_acc": 79.49, "cmmlu-high_school_microeconomics_5shot_acc": 84.87, "cmmlu-high_school_computer_science_5shot_acc": 83, "cmmlu-high_school_european_history_5shot_acc": 82.42, "cmmlu-high_school_government_and_politics_5shot_acc": 91.19, "cmmlu-anatomy_5shot_acc_norm": 61.48, "cmmlu_fullavg_5shot_acc_norm": 70.56, "cmmlu-virology_5shot_acc_norm": 49.4, "cmmlu-astronomy_5shot_acc_norm": 80.26, "cmmlu-marketing_5shot_acc_norm": 87.61, "cmmlu-nutrition_5shot_acc_norm": 79.08, "cmmlu-sociology_5shot_acc_norm": 85.07, "cmmlu-management_5shot_acc_norm": 83.5, "cmmlu-philosophy_5shot_acc_norm": 71.38, "cmmlu-prehistory_5shot_acc_norm": 72.53, "cmmlu-human_aging_5shot_acc_norm": 69.06, 
"cmmlu-econometrics_5shot_acc_norm": 55.26, "cmmlu-formal_logic_5shot_acc_norm": 49.6, "cmmlu-global_facts_5shot_acc_norm": 54, "cmmlu-jurisprudence_5shot_acc_norm": 81.48, "cmmlu-miscellaneous_5shot_acc_norm": 78.93, "cmmlu-moral_disputes_5shot_acc_norm": 72.54, "cmmlu-business_ethics_5shot_acc_norm": 78, "cmmlu-college_biology_5shot_acc_norm": 72.92, "cmmlu-college_physics_5shot_acc_norm": 54.9, "cmmlu-human_sexuality_5shot_acc_norm": 77.1, "cmmlu-moral_scenarios_5shot_acc_norm": 45.03, "cmmlu-world_religions_5shot_acc_norm": 70.76, "cmmlu-abstract_algebra_5shot_acc_norm": 42, "cmmlu-college_medicine_5shot_acc_norm": 77.46, "cmmlu-machine_learning_5shot_acc_norm": 53.57, "cmmlu-medical_genetics_5shot_acc_norm": 72, "cmmlu-professional_law_5shot_acc_norm": 52.22, "cmmlu-public_relations_5shot_acc_norm": 66.36, "cmmlu-security_studies_5shot_acc_norm": 77.14, "cmmlu-college_chemistry_5shot_acc_norm": 54, "cmmlu-computer_security_5shot_acc_norm": 76, "cmmlu-international_law_5shot_acc_norm": 90.91, "cmmlu-logical_fallacies_5shot_acc_norm": 71.78, "cmmlu-us_foreign_policy_5shot_acc_norm": 86, "cmmlu-clinical_knowledge_5shot_acc_norm": 75.85, "cmmlu-conceptual_physics_5shot_acc_norm": 72.77, "cmmlu-college_mathematics_5shot_acc_norm": 44, "cmmlu-high_school_biology_5shot_acc_norm": 84.84, "cmmlu-high_school_physics_5shot_acc_norm": 54.97, "cmmlu-high_school_chemistry_5shot_acc_norm": 63.05, "cmmlu-high_school_geography_5shot_acc_norm": 82.83, "cmmlu-professional_medicine_5shot_acc_norm": 74.26, "cmmlu-electrical_engineering_5shot_acc_norm": 67.59, "cmmlu-elementary_mathematics_5shot_acc_norm": 71.16, "cmmlu-high_school_psychology_5shot_acc_norm": 83.12, "cmmlu-high_school_statistics_5shot_acc_norm": 67.13, "cmmlu-high_school_us_history_5shot_acc_norm": 86.76, "cmmlu-high_school_mathematics_5shot_acc_norm": 47.41, "cmmlu-professional_accounting_5shot_acc_norm": 56.38, "cmmlu-professional_psychology_5shot_acc_norm": 69.12, "cmmlu-college_computer_science_5shot_acc_norm": 
64, "cmmlu-high_school_world_history_5shot_acc_norm": 86.5, "cmmlu-high_school_macroeconomics_5shot_acc_norm": 79.49, "cmmlu-high_school_microeconomics_5shot_acc_norm": 84.87, "cmmlu-high_school_computer_science_5shot_acc_norm": 83, "cmmlu-high_school_european_history_5shot_acc_norm": 82.42, "cmmlu-high_school_government_and_politics_5shot_acc_norm": 91.19 } }
{}
{}
{}
{}
{ "model_name": "01-ai/Yi-1.5-6B", "model_dtype": "float16", "model_size": 0 }
{ "harness-c_arc_challenge": { "acc_norm": 49.32, "acc_stderr": 0, "c_arc_challenge_25shot_acc": 43.43, "c_arc_challenge_25shot_acc_norm": 49.32 }, "harness-c_gsm8k": { "acc": 39.58, "acc_stderr": 0, "c_gsm8k_5shot_acc": 39.58 }, "harness-c_hellaswag": { "acc_norm": 60.24, "acc_stderr": 0, "c_hellaswag_10shot_acc": 43.92, "c_hellaswag_10shot_acc_norm": 60.24 }, "harness-c-sem-v2": { "acc": 72.1975, "acc_stderr": 0, "c_sem_v2-LLSRC_5shot_acc": 78.13, "c_sem_v2-SLPWC_5shot_acc": 59, "c_sem_v2-SLRFC_5shot_acc": 74.68, "c_sem_v2-SLSRC_5shot_acc": 76.98, "c_sem_v2-LLSRC_5shot_acc_norm": 78.13, "c_sem_v2-SLPWC_5shot_acc_norm": 59, "c_sem_v2-SLRFC_5shot_acc_norm": 74.68, "c_sem_v2-SLSRC_5shot_acc_norm": 76.98 }, "harness-c_truthfulqa_mc": { "mc2": 48.34, "acc_stderr": 0, "c_truthfulqa_mc_0shot_mc1": 27.17, "c_truthfulqa_mc_0shot_mc2": 48.34 }, "harness-c_winogrande": { "acc": 65.04, "acc_stderr": 0, "c_winogrande_0shot_acc": 65.04 }, "harness-cmmlu": { "acc_norm": 57.54, "acc_stderr": 0, "cmmlu-anatomy_5shot_acc": 48.15, "cmmlu_fullavg_5shot_acc": 57.54, "cmmlu-virology_5shot_acc": 43.37, "cmmlu-astronomy_5shot_acc": 59.21, "cmmlu-marketing_5shot_acc": 79.91, "cmmlu-nutrition_5shot_acc": 65.36, "cmmlu-sociology_5shot_acc": 73.13, "cmmlu-management_5shot_acc": 69.9, "cmmlu-philosophy_5shot_acc": 61.41, "cmmlu-prehistory_5shot_acc": 57.72, "cmmlu-human_aging_5shot_acc": 64.57, "cmmlu-econometrics_5shot_acc": 42.98, "cmmlu-formal_logic_5shot_acc": 40, "cmmlu-global_facts_5shot_acc": 40, "cmmlu-jurisprudence_5shot_acc": 73.15, "cmmlu-miscellaneous_5shot_acc": 64.5, "cmmlu-moral_disputes_5shot_acc": 61.85, "cmmlu-business_ethics_5shot_acc": 59, "cmmlu-college_biology_5shot_acc": 50.69, "cmmlu-college_physics_5shot_acc": 41.18, "cmmlu-human_sexuality_5shot_acc": 65.65, "cmmlu-moral_scenarios_5shot_acc": 29.39, "cmmlu-world_religions_5shot_acc": 59.06, "cmmlu-abstract_algebra_5shot_acc": 27, "cmmlu-college_medicine_5shot_acc": 58.38, "cmmlu-machine_learning_5shot_acc": 47.32, 
"cmmlu-medical_genetics_5shot_acc": 62, "cmmlu-professional_law_5shot_acc": 39.18, "cmmlu-public_relations_5shot_acc": 56.36, "cmmlu-security_studies_5shot_acc": 67.76, "cmmlu-college_chemistry_5shot_acc": 50, "cmmlu-computer_security_5shot_acc": 69, "cmmlu-international_law_5shot_acc": 78.51, "cmmlu-logical_fallacies_5shot_acc": 53.99, "cmmlu-us_foreign_policy_5shot_acc": 82, "cmmlu-clinical_knowledge_5shot_acc": 62.26, "cmmlu-conceptual_physics_5shot_acc": 54.47, "cmmlu-college_mathematics_5shot_acc": 43, "cmmlu-high_school_biology_5shot_acc": 68.06, "cmmlu-high_school_physics_5shot_acc": 39.74, "cmmlu-high_school_chemistry_5shot_acc": 52.71, "cmmlu-high_school_geography_5shot_acc": 69.19, "cmmlu-professional_medicine_5shot_acc": 51.84, "cmmlu-electrical_engineering_5shot_acc": 58.62, "cmmlu-elementary_mathematics_5shot_acc": 44.97, "cmmlu-high_school_psychology_5shot_acc": 70.46, "cmmlu-high_school_statistics_5shot_acc": 50, "cmmlu-high_school_us_history_5shot_acc": 70.1, "cmmlu-high_school_mathematics_5shot_acc": 35.93, "cmmlu-professional_accounting_5shot_acc": 46.1, "cmmlu-professional_psychology_5shot_acc": 58.17, "cmmlu-college_computer_science_5shot_acc": 51, "cmmlu-high_school_world_history_5shot_acc": 73.84, "cmmlu-high_school_macroeconomics_5shot_acc": 61.03, "cmmlu-high_school_microeconomics_5shot_acc": 69.75, "cmmlu-high_school_computer_science_5shot_acc": 63, "cmmlu-high_school_european_history_5shot_acc": 70.3, "cmmlu-high_school_government_and_politics_5shot_acc": 73.58, "cmmlu-anatomy_5shot_acc_norm": 48.15, "cmmlu_fullavg_5shot_acc_norm": 57.54, "cmmlu-virology_5shot_acc_norm": 43.37, "cmmlu-astronomy_5shot_acc_norm": 59.21, "cmmlu-marketing_5shot_acc_norm": 79.91, "cmmlu-nutrition_5shot_acc_norm": 65.36, "cmmlu-sociology_5shot_acc_norm": 73.13, "cmmlu-management_5shot_acc_norm": 69.9, "cmmlu-philosophy_5shot_acc_norm": 61.41, "cmmlu-prehistory_5shot_acc_norm": 57.72, "cmmlu-human_aging_5shot_acc_norm": 64.57, "cmmlu-econometrics_5shot_acc_norm": 
42.98, "cmmlu-formal_logic_5shot_acc_norm": 40, "cmmlu-global_facts_5shot_acc_norm": 40, "cmmlu-jurisprudence_5shot_acc_norm": 73.15, "cmmlu-miscellaneous_5shot_acc_norm": 64.5, "cmmlu-moral_disputes_5shot_acc_norm": 61.85, "cmmlu-business_ethics_5shot_acc_norm": 59, "cmmlu-college_biology_5shot_acc_norm": 50.69, "cmmlu-college_physics_5shot_acc_norm": 41.18, "cmmlu-human_sexuality_5shot_acc_norm": 65.65, "cmmlu-moral_scenarios_5shot_acc_norm": 29.39, "cmmlu-world_religions_5shot_acc_norm": 59.06, "cmmlu-abstract_algebra_5shot_acc_norm": 27, "cmmlu-college_medicine_5shot_acc_norm": 58.38, "cmmlu-machine_learning_5shot_acc_norm": 47.32, "cmmlu-medical_genetics_5shot_acc_norm": 62, "cmmlu-professional_law_5shot_acc_norm": 39.18, "cmmlu-public_relations_5shot_acc_norm": 56.36, "cmmlu-security_studies_5shot_acc_norm": 67.76, "cmmlu-college_chemistry_5shot_acc_norm": 50, "cmmlu-computer_security_5shot_acc_norm": 69, "cmmlu-international_law_5shot_acc_norm": 78.51, "cmmlu-logical_fallacies_5shot_acc_norm": 53.99, "cmmlu-us_foreign_policy_5shot_acc_norm": 82, "cmmlu-clinical_knowledge_5shot_acc_norm": 62.26, "cmmlu-conceptual_physics_5shot_acc_norm": 54.47, "cmmlu-college_mathematics_5shot_acc_norm": 43, "cmmlu-high_school_biology_5shot_acc_norm": 68.06, "cmmlu-high_school_physics_5shot_acc_norm": 39.74, "cmmlu-high_school_chemistry_5shot_acc_norm": 52.71, "cmmlu-high_school_geography_5shot_acc_norm": 69.19, "cmmlu-professional_medicine_5shot_acc_norm": 51.84, "cmmlu-electrical_engineering_5shot_acc_norm": 58.62, "cmmlu-elementary_mathematics_5shot_acc_norm": 44.97, "cmmlu-high_school_psychology_5shot_acc_norm": 70.46, "cmmlu-high_school_statistics_5shot_acc_norm": 50, "cmmlu-high_school_us_history_5shot_acc_norm": 70.1, "cmmlu-high_school_mathematics_5shot_acc_norm": 35.93, "cmmlu-professional_accounting_5shot_acc_norm": 46.1, "cmmlu-professional_psychology_5shot_acc_norm": 58.17, "cmmlu-college_computer_science_5shot_acc_norm": 51, 
"cmmlu-high_school_world_history_5shot_acc_norm": 73.84, "cmmlu-high_school_macroeconomics_5shot_acc_norm": 61.03, "cmmlu-high_school_microeconomics_5shot_acc_norm": 69.75, "cmmlu-high_school_computer_science_5shot_acc_norm": 63, "cmmlu-high_school_european_history_5shot_acc_norm": 70.3, "cmmlu-high_school_government_and_politics_5shot_acc_norm": 73.58 } }
{}
{}
{}
{}
{ "model_name": "01-ai/Yi-1.5-9B-32K", "model_dtype": "float16", "model_size": 0 }
{ "harness-c_arc_challenge": { "acc_norm": 55.2, "acc_stderr": 0, "c_arc_challenge_25shot_acc": 50, "c_arc_challenge_25shot_acc_norm": 55.2 }, "harness-c_gsm8k": { "acc": 0, "acc_stderr": 0, "c_gsm8k_5shot_acc": 0 }, "harness-c_hellaswag": { "acc_norm": 62.5, "acc_stderr": 0, "c_hellaswag_10shot_acc": 45.66, "c_hellaswag_10shot_acc_norm": 62.5 }, "harness-c-sem-v2": { "acc": 80.94749999999999, "acc_stderr": 0, "c_sem_v2-LLSRC_5shot_acc": 86.91, "c_sem_v2-SLPWC_5shot_acc": 73.57, "c_sem_v2-SLRFC_5shot_acc": 80, "c_sem_v2-SLSRC_5shot_acc": 83.31, "c_sem_v2-LLSRC_5shot_acc_norm": 86.91, "c_sem_v2-SLPWC_5shot_acc_norm": 73.57, "c_sem_v2-SLRFC_5shot_acc_norm": 80, "c_sem_v2-SLSRC_5shot_acc_norm": 83.31 }, "harness-c_truthfulqa_mc": { "mc2": 45.22, "acc_stderr": 0, "c_truthfulqa_mc_0shot_mc1": 24.85, "c_truthfulqa_mc_0shot_mc2": 45.22 }, "harness-c_winogrande": { "acc": 64.72, "acc_stderr": 0, "c_winogrande_0shot_acc": 64.72 }, "harness-cmmlu": { "acc_norm": 63.02, "acc_stderr": 0, "cmmlu-anatomy_5shot_acc": 46.67, "cmmlu_fullavg_5shot_acc": 63.02, "cmmlu-virology_5shot_acc": 46.99, "cmmlu-astronomy_5shot_acc": 69.08, "cmmlu-marketing_5shot_acc": 81.2, "cmmlu-nutrition_5shot_acc": 68.95, "cmmlu-sociology_5shot_acc": 78.11, "cmmlu-management_5shot_acc": 77.67, "cmmlu-philosophy_5shot_acc": 68.81, "cmmlu-prehistory_5shot_acc": 61.73, "cmmlu-human_aging_5shot_acc": 64.57, "cmmlu-econometrics_5shot_acc": 52.63, "cmmlu-formal_logic_5shot_acc": 45.6, "cmmlu-global_facts_5shot_acc": 45, "cmmlu-jurisprudence_5shot_acc": 77.78, "cmmlu-miscellaneous_5shot_acc": 71.26, "cmmlu-moral_disputes_5shot_acc": 65.9, "cmmlu-business_ethics_5shot_acc": 71, "cmmlu-college_biology_5shot_acc": 61.11, "cmmlu-college_physics_5shot_acc": 43.14, "cmmlu-human_sexuality_5shot_acc": 65.65, "cmmlu-moral_scenarios_5shot_acc": 36.76, "cmmlu-world_religions_5shot_acc": 67.84, "cmmlu-abstract_algebra_5shot_acc": 35, "cmmlu-college_medicine_5shot_acc": 63.01, "cmmlu-machine_learning_5shot_acc": 47.32, 
"cmmlu-medical_genetics_5shot_acc": 63, "cmmlu-professional_law_5shot_acc": 43.94, "cmmlu-public_relations_5shot_acc": 66.36, "cmmlu-security_studies_5shot_acc": 75.1, "cmmlu-college_chemistry_5shot_acc": 52, "cmmlu-computer_security_5shot_acc": 71, "cmmlu-international_law_5shot_acc": 80.17, "cmmlu-logical_fallacies_5shot_acc": 65.64, "cmmlu-us_foreign_policy_5shot_acc": 83, "cmmlu-clinical_knowledge_5shot_acc": 66.04, "cmmlu-conceptual_physics_5shot_acc": 59.15, "cmmlu-college_mathematics_5shot_acc": 47, "cmmlu-high_school_biology_5shot_acc": 75.48, "cmmlu-high_school_physics_5shot_acc": 41.72, "cmmlu-high_school_chemistry_5shot_acc": 51.23, "cmmlu-high_school_geography_5shot_acc": 78.79, "cmmlu-professional_medicine_5shot_acc": 59.19, "cmmlu-electrical_engineering_5shot_acc": 68.28, "cmmlu-elementary_mathematics_5shot_acc": 55.56, "cmmlu-high_school_psychology_5shot_acc": 78.35, "cmmlu-high_school_statistics_5shot_acc": 58.8, "cmmlu-high_school_us_history_5shot_acc": 81.37, "cmmlu-high_school_mathematics_5shot_acc": 40, "cmmlu-professional_accounting_5shot_acc": 51.42, "cmmlu-professional_psychology_5shot_acc": 62.09, "cmmlu-college_computer_science_5shot_acc": 48, "cmmlu-high_school_world_history_5shot_acc": 78.48, "cmmlu-high_school_macroeconomics_5shot_acc": 69.49, "cmmlu-high_school_microeconomics_5shot_acc": 75.21, "cmmlu-high_school_computer_science_5shot_acc": 76, "cmmlu-high_school_european_history_5shot_acc": 78.18, "cmmlu-high_school_government_and_politics_5shot_acc": 79.27, "cmmlu-anatomy_5shot_acc_norm": 46.67, "cmmlu_fullavg_5shot_acc_norm": 63.02, "cmmlu-virology_5shot_acc_norm": 46.99, "cmmlu-astronomy_5shot_acc_norm": 69.08, "cmmlu-marketing_5shot_acc_norm": 81.2, "cmmlu-nutrition_5shot_acc_norm": 68.95, "cmmlu-sociology_5shot_acc_norm": 78.11, "cmmlu-management_5shot_acc_norm": 77.67, "cmmlu-philosophy_5shot_acc_norm": 68.81, "cmmlu-prehistory_5shot_acc_norm": 61.73, "cmmlu-human_aging_5shot_acc_norm": 64.57, 
"cmmlu-econometrics_5shot_acc_norm": 52.63, "cmmlu-formal_logic_5shot_acc_norm": 45.6, "cmmlu-global_facts_5shot_acc_norm": 45, "cmmlu-jurisprudence_5shot_acc_norm": 77.78, "cmmlu-miscellaneous_5shot_acc_norm": 71.26, "cmmlu-moral_disputes_5shot_acc_norm": 65.9, "cmmlu-business_ethics_5shot_acc_norm": 71, "cmmlu-college_biology_5shot_acc_norm": 61.11, "cmmlu-college_physics_5shot_acc_norm": 43.14, "cmmlu-human_sexuality_5shot_acc_norm": 65.65, "cmmlu-moral_scenarios_5shot_acc_norm": 36.76, "cmmlu-world_religions_5shot_acc_norm": 67.84, "cmmlu-abstract_algebra_5shot_acc_norm": 35, "cmmlu-college_medicine_5shot_acc_norm": 63.01, "cmmlu-machine_learning_5shot_acc_norm": 47.32, "cmmlu-medical_genetics_5shot_acc_norm": 63, "cmmlu-professional_law_5shot_acc_norm": 43.94, "cmmlu-public_relations_5shot_acc_norm": 66.36, "cmmlu-security_studies_5shot_acc_norm": 75.1, "cmmlu-college_chemistry_5shot_acc_norm": 52, "cmmlu-computer_security_5shot_acc_norm": 71, "cmmlu-international_law_5shot_acc_norm": 80.17, "cmmlu-logical_fallacies_5shot_acc_norm": 65.64, "cmmlu-us_foreign_policy_5shot_acc_norm": 83, "cmmlu-clinical_knowledge_5shot_acc_norm": 66.04, "cmmlu-conceptual_physics_5shot_acc_norm": 59.15, "cmmlu-college_mathematics_5shot_acc_norm": 47, "cmmlu-high_school_biology_5shot_acc_norm": 75.48, "cmmlu-high_school_physics_5shot_acc_norm": 41.72, "cmmlu-high_school_chemistry_5shot_acc_norm": 51.23, "cmmlu-high_school_geography_5shot_acc_norm": 78.79, "cmmlu-professional_medicine_5shot_acc_norm": 59.19, "cmmlu-electrical_engineering_5shot_acc_norm": 68.28, "cmmlu-elementary_mathematics_5shot_acc_norm": 55.56, "cmmlu-high_school_psychology_5shot_acc_norm": 78.35, "cmmlu-high_school_statistics_5shot_acc_norm": 58.8, "cmmlu-high_school_us_history_5shot_acc_norm": 81.37, "cmmlu-high_school_mathematics_5shot_acc_norm": 40, "cmmlu-professional_accounting_5shot_acc_norm": 51.42, "cmmlu-professional_psychology_5shot_acc_norm": 62.09, "cmmlu-college_computer_science_5shot_acc_norm": 48, 
"cmmlu-high_school_world_history_5shot_acc_norm": 78.48, "cmmlu-high_school_macroeconomics_5shot_acc_norm": 69.49, "cmmlu-high_school_microeconomics_5shot_acc_norm": 75.21, "cmmlu-high_school_computer_science_5shot_acc_norm": 76, "cmmlu-high_school_european_history_5shot_acc_norm": 78.18, "cmmlu-high_school_government_and_politics_5shot_acc_norm": 79.27 } }
{}
{}
{}
{}
{ "model_name": "01-ai/Yi-1.5-9B-Chat-16K", "model_dtype": "float16", "model_size": 0 }
{ "harness-c_arc_challenge": { "acc_norm": 57.51, "acc_stderr": 0, "c_arc_challenge_25shot_acc": 52.65, "c_arc_challenge_25shot_acc_norm": 57.51 }, "harness-c_gsm8k": { "acc": 59.97, "acc_stderr": 0, "c_gsm8k_5shot_acc": 59.97 }, "harness-c_hellaswag": { "acc_norm": 64.82, "acc_stderr": 0, "c_hellaswag_10shot_acc": 48.21, "c_hellaswag_10shot_acc_norm": 64.82 }, "harness-c-sem-v2": { "acc": 86.08250000000001, "acc_stderr": 0, "c_sem_v2-LLSRC_5shot_acc": 88.06, "c_sem_v2-SLPWC_5shot_acc": 79.43, "c_sem_v2-SLRFC_5shot_acc": 92.09, "c_sem_v2-SLSRC_5shot_acc": 84.75, "c_sem_v2-LLSRC_5shot_acc_norm": 88.06, "c_sem_v2-SLPWC_5shot_acc_norm": 79.43, "c_sem_v2-SLRFC_5shot_acc_norm": 92.09, "c_sem_v2-SLSRC_5shot_acc_norm": 84.75 }, "harness-c_truthfulqa_mc": { "mc2": 53.88, "acc_stderr": 0, "c_truthfulqa_mc_0shot_mc1": 31.46, "c_truthfulqa_mc_0shot_mc2": 53.88 }, "harness-c_winogrande": { "acc": 65.59, "acc_stderr": 0, "c_winogrande_0shot_acc": 65.59 }, "CLCC-H": { "acc": 0.4904, "acc_stderr": 0 }, "harness-cmmlu": { "acc_norm": 63.4, "acc_stderr": 0, "cmmlu-anatomy_5shot_acc": 57.04, "cmmlu_fullavg_5shot_acc": 63.4, "cmmlu-virology_5shot_acc": 45.18, "cmmlu-astronomy_5shot_acc": 65.79, "cmmlu-marketing_5shot_acc": 80.77, "cmmlu-nutrition_5shot_acc": 68.95, "cmmlu-sociology_5shot_acc": 78.11, "cmmlu-management_5shot_acc": 75.73, "cmmlu-philosophy_5shot_acc": 63.34, "cmmlu-prehistory_5shot_acc": 61.73, "cmmlu-human_aging_5shot_acc": 68.16, "cmmlu-econometrics_5shot_acc": 56.14, "cmmlu-formal_logic_5shot_acc": 46.4, "cmmlu-global_facts_5shot_acc": 45, "cmmlu-jurisprudence_5shot_acc": 80.56, "cmmlu-miscellaneous_5shot_acc": 71.26, "cmmlu-moral_disputes_5shot_acc": 69.36, "cmmlu-business_ethics_5shot_acc": 72, "cmmlu-college_biology_5shot_acc": 61.11, "cmmlu-college_physics_5shot_acc": 46.08, "cmmlu-human_sexuality_5shot_acc": 72.52, "cmmlu-moral_scenarios_5shot_acc": 41.23, "cmmlu-world_religions_5shot_acc": 64.91, "cmmlu-abstract_algebra_5shot_acc": 32, 
"cmmlu-college_medicine_5shot_acc": 64.16, "cmmlu-machine_learning_5shot_acc": 48.21, "cmmlu-medical_genetics_5shot_acc": 65, "cmmlu-professional_law_5shot_acc": 44.2, "cmmlu-public_relations_5shot_acc": 61.82, "cmmlu-security_studies_5shot_acc": 73.47, "cmmlu-college_chemistry_5shot_acc": 57, "cmmlu-computer_security_5shot_acc": 76, "cmmlu-international_law_5shot_acc": 76.03, "cmmlu-logical_fallacies_5shot_acc": 61.35, "cmmlu-us_foreign_policy_5shot_acc": 86, "cmmlu-clinical_knowledge_5shot_acc": 68.3, "cmmlu-conceptual_physics_5shot_acc": 64.68, "cmmlu-college_mathematics_5shot_acc": 38, "cmmlu-high_school_biology_5shot_acc": 74.84, "cmmlu-high_school_physics_5shot_acc": 47.02, "cmmlu-high_school_chemistry_5shot_acc": 57.14, "cmmlu-high_school_geography_5shot_acc": 82.32, "cmmlu-professional_medicine_5shot_acc": 51.47, "cmmlu-electrical_engineering_5shot_acc": 63.45, "cmmlu-elementary_mathematics_5shot_acc": 59.52, "cmmlu-high_school_psychology_5shot_acc": 78.72, "cmmlu-high_school_statistics_5shot_acc": 58.33, "cmmlu-high_school_us_history_5shot_acc": 79.41, "cmmlu-high_school_mathematics_5shot_acc": 39.63, "cmmlu-professional_accounting_5shot_acc": 47.16, "cmmlu-professional_psychology_5shot_acc": 59.64, "cmmlu-college_computer_science_5shot_acc": 45, "cmmlu-high_school_world_history_5shot_acc": 79.32, "cmmlu-high_school_macroeconomics_5shot_acc": 71.03, "cmmlu-high_school_microeconomics_5shot_acc": 76.05, "cmmlu-high_school_computer_science_5shot_acc": 76, "cmmlu-high_school_european_history_5shot_acc": 83.03, "cmmlu-high_school_government_and_politics_5shot_acc": 77.2, "cmmlu-anatomy_5shot_acc_norm": 57.04, "cmmlu_fullavg_5shot_acc_norm": 63.4, "cmmlu-virology_5shot_acc_norm": 45.18, "cmmlu-astronomy_5shot_acc_norm": 65.79, "cmmlu-marketing_5shot_acc_norm": 80.77, "cmmlu-nutrition_5shot_acc_norm": 68.95, "cmmlu-sociology_5shot_acc_norm": 78.11, "cmmlu-management_5shot_acc_norm": 75.73, "cmmlu-philosophy_5shot_acc_norm": 63.34, 
"cmmlu-prehistory_5shot_acc_norm": 61.73, "cmmlu-human_aging_5shot_acc_norm": 68.16, "cmmlu-econometrics_5shot_acc_norm": 56.14, "cmmlu-formal_logic_5shot_acc_norm": 46.4, "cmmlu-global_facts_5shot_acc_norm": 45, "cmmlu-jurisprudence_5shot_acc_norm": 80.56, "cmmlu-miscellaneous_5shot_acc_norm": 71.26, "cmmlu-moral_disputes_5shot_acc_norm": 69.36, "cmmlu-business_ethics_5shot_acc_norm": 72, "cmmlu-college_biology_5shot_acc_norm": 61.11, "cmmlu-college_physics_5shot_acc_norm": 46.08, "cmmlu-human_sexuality_5shot_acc_norm": 72.52, "cmmlu-moral_scenarios_5shot_acc_norm": 41.23, "cmmlu-world_religions_5shot_acc_norm": 64.91, "cmmlu-abstract_algebra_5shot_acc_norm": 32, "cmmlu-college_medicine_5shot_acc_norm": 64.16, "cmmlu-machine_learning_5shot_acc_norm": 48.21, "cmmlu-medical_genetics_5shot_acc_norm": 65, "cmmlu-professional_law_5shot_acc_norm": 44.2, "cmmlu-public_relations_5shot_acc_norm": 61.82, "cmmlu-security_studies_5shot_acc_norm": 73.47, "cmmlu-college_chemistry_5shot_acc_norm": 57, "cmmlu-computer_security_5shot_acc_norm": 76, "cmmlu-international_law_5shot_acc_norm": 76.03, "cmmlu-logical_fallacies_5shot_acc_norm": 61.35, "cmmlu-us_foreign_policy_5shot_acc_norm": 86, "cmmlu-clinical_knowledge_5shot_acc_norm": 68.3, "cmmlu-conceptual_physics_5shot_acc_norm": 64.68, "cmmlu-college_mathematics_5shot_acc_norm": 38, "cmmlu-high_school_biology_5shot_acc_norm": 74.84, "cmmlu-high_school_physics_5shot_acc_norm": 47.02, "cmmlu-high_school_chemistry_5shot_acc_norm": 57.14, "cmmlu-high_school_geography_5shot_acc_norm": 82.32, "cmmlu-professional_medicine_5shot_acc_norm": 51.47, "cmmlu-electrical_engineering_5shot_acc_norm": 63.45, "cmmlu-elementary_mathematics_5shot_acc_norm": 59.52, "cmmlu-high_school_psychology_5shot_acc_norm": 78.72, "cmmlu-high_school_statistics_5shot_acc_norm": 58.33, "cmmlu-high_school_us_history_5shot_acc_norm": 79.41, "cmmlu-high_school_mathematics_5shot_acc_norm": 39.63, "cmmlu-professional_accounting_5shot_acc_norm": 47.16, 
"cmmlu-professional_psychology_5shot_acc_norm": 59.64, "cmmlu-college_computer_science_5shot_acc_norm": 45, "cmmlu-high_school_world_history_5shot_acc_norm": 79.32, "cmmlu-high_school_macroeconomics_5shot_acc_norm": 71.03, "cmmlu-high_school_microeconomics_5shot_acc_norm": 76.05, "cmmlu-high_school_computer_science_5shot_acc_norm": 76, "cmmlu-high_school_european_history_5shot_acc_norm": 83.03, "cmmlu-high_school_government_and_politics_5shot_acc_norm": 77.2 } }
{}
{}
{}
{}
{ "model_name": "01-ai/Yi-1.5-9B-Chat", "model_dtype": "float16", "model_size": 0 }
{ "harness-c_arc_challenge": { "acc_norm": 56.83, "acc_stderr": 0, "c_arc_challenge_25shot_acc": 52.22, "c_arc_challenge_25shot_acc_norm": 56.83 }, "harness-c_gsm8k": { "acc": 64.22, "acc_stderr": 0, "c_gsm8k_5shot_acc": 64.22 }, "harness-c_hellaswag": { "acc_norm": 62.81, "acc_stderr": 0, "c_hellaswag_10shot_acc": 46.81, "c_hellaswag_10shot_acc_norm": 62.81 }, "harness-c-sem-v2": { "acc": 84.5375, "acc_stderr": 0, "c_sem_v2-LLSRC_5shot_acc": 88.35, "c_sem_v2-SLPWC_5shot_acc": 79.29, "c_sem_v2-SLRFC_5shot_acc": 88.78, "c_sem_v2-SLSRC_5shot_acc": 81.73, "c_sem_v2-LLSRC_5shot_acc_norm": 88.35, "c_sem_v2-SLPWC_5shot_acc_norm": 79.29, "c_sem_v2-SLRFC_5shot_acc_norm": 88.78, "c_sem_v2-SLSRC_5shot_acc_norm": 81.73 }, "harness-c_truthfulqa_mc": { "mc2": 54.1, "acc_stderr": 0, "c_truthfulqa_mc_0shot_mc1": 32.56, "c_truthfulqa_mc_0shot_mc2": 54.1 }, "harness-c_winogrande": { "acc": 64.96, "acc_stderr": 0, "c_winogrande_0shot_acc": 64.96 }, "CLCC-H": { "acc": 0.6576, "acc_stderr": 0 }, "harness-cmmlu": { "acc_norm": 61.02, "acc_stderr": 0, "cmmlu-anatomy_5shot_acc": 53.33, "cmmlu_fullavg_5shot_acc": 61.02, "cmmlu-virology_5shot_acc": 43.37, "cmmlu-astronomy_5shot_acc": 67.76, "cmmlu-marketing_5shot_acc": 83.33, "cmmlu-nutrition_5shot_acc": 68.3, "cmmlu-sociology_5shot_acc": 73.13, "cmmlu-management_5shot_acc": 72.82, "cmmlu-philosophy_5shot_acc": 58.84, "cmmlu-prehistory_5shot_acc": 59.57, "cmmlu-human_aging_5shot_acc": 62.78, "cmmlu-econometrics_5shot_acc": 46.49, "cmmlu-formal_logic_5shot_acc": 44, "cmmlu-global_facts_5shot_acc": 40, "cmmlu-jurisprudence_5shot_acc": 66.67, "cmmlu-miscellaneous_5shot_acc": 64.75, "cmmlu-moral_disputes_5shot_acc": 64.74, "cmmlu-business_ethics_5shot_acc": 67, "cmmlu-college_biology_5shot_acc": 57.64, "cmmlu-college_physics_5shot_acc": 41.18, "cmmlu-human_sexuality_5shot_acc": 67.94, "cmmlu-moral_scenarios_5shot_acc": 39.55, "cmmlu-world_religions_5shot_acc": 65.5, "cmmlu-abstract_algebra_5shot_acc": 29, "cmmlu-college_medicine_5shot_acc": 
65.9, "cmmlu-machine_learning_5shot_acc": 47.32, "cmmlu-medical_genetics_5shot_acc": 61, "cmmlu-professional_law_5shot_acc": 42.24, "cmmlu-public_relations_5shot_acc": 53.64, "cmmlu-security_studies_5shot_acc": 70.2, "cmmlu-college_chemistry_5shot_acc": 53, "cmmlu-computer_security_5shot_acc": 67, "cmmlu-international_law_5shot_acc": 75.21, "cmmlu-logical_fallacies_5shot_acc": 59.51, "cmmlu-us_foreign_policy_5shot_acc": 79, "cmmlu-clinical_knowledge_5shot_acc": 64.53, "cmmlu-conceptual_physics_5shot_acc": 65.96, "cmmlu-college_mathematics_5shot_acc": 49, "cmmlu-high_school_biology_5shot_acc": 73.55, "cmmlu-high_school_physics_5shot_acc": 47.02, "cmmlu-high_school_chemistry_5shot_acc": 55.67, "cmmlu-high_school_geography_5shot_acc": 72.22, "cmmlu-professional_medicine_5shot_acc": 54.78, "cmmlu-electrical_engineering_5shot_acc": 65.52, "cmmlu-elementary_mathematics_5shot_acc": 58.73, "cmmlu-high_school_psychology_5shot_acc": 75.6, "cmmlu-high_school_statistics_5shot_acc": 60.65, "cmmlu-high_school_us_history_5shot_acc": 71.57, "cmmlu-high_school_mathematics_5shot_acc": 42.22, "cmmlu-professional_accounting_5shot_acc": 50.35, "cmmlu-professional_psychology_5shot_acc": 59.31, "cmmlu-college_computer_science_5shot_acc": 57, "cmmlu-high_school_world_history_5shot_acc": 78.06, "cmmlu-high_school_macroeconomics_5shot_acc": 67.95, "cmmlu-high_school_microeconomics_5shot_acc": 79.41, "cmmlu-high_school_computer_science_5shot_acc": 73, "cmmlu-high_school_european_history_5shot_acc": 75.15, "cmmlu-high_school_government_and_politics_5shot_acc": 68.91, "cmmlu-anatomy_5shot_acc_norm": 53.33, "cmmlu_fullavg_5shot_acc_norm": 61.02, "cmmlu-virology_5shot_acc_norm": 43.37, "cmmlu-astronomy_5shot_acc_norm": 67.76, "cmmlu-marketing_5shot_acc_norm": 83.33, "cmmlu-nutrition_5shot_acc_norm": 68.3, "cmmlu-sociology_5shot_acc_norm": 73.13, "cmmlu-management_5shot_acc_norm": 72.82, "cmmlu-philosophy_5shot_acc_norm": 58.84, "cmmlu-prehistory_5shot_acc_norm": 59.57, 
"cmmlu-human_aging_5shot_acc_norm": 62.78, "cmmlu-econometrics_5shot_acc_norm": 46.49, "cmmlu-formal_logic_5shot_acc_norm": 44, "cmmlu-global_facts_5shot_acc_norm": 40, "cmmlu-jurisprudence_5shot_acc_norm": 66.67, "cmmlu-miscellaneous_5shot_acc_norm": 64.75, "cmmlu-moral_disputes_5shot_acc_norm": 64.74, "cmmlu-business_ethics_5shot_acc_norm": 67, "cmmlu-college_biology_5shot_acc_norm": 57.64, "cmmlu-college_physics_5shot_acc_norm": 41.18, "cmmlu-human_sexuality_5shot_acc_norm": 67.94, "cmmlu-moral_scenarios_5shot_acc_norm": 39.55, "cmmlu-world_religions_5shot_acc_norm": 65.5, "cmmlu-abstract_algebra_5shot_acc_norm": 29, "cmmlu-college_medicine_5shot_acc_norm": 65.9, "cmmlu-machine_learning_5shot_acc_norm": 47.32, "cmmlu-medical_genetics_5shot_acc_norm": 61, "cmmlu-professional_law_5shot_acc_norm": 42.24, "cmmlu-public_relations_5shot_acc_norm": 53.64, "cmmlu-security_studies_5shot_acc_norm": 70.2, "cmmlu-college_chemistry_5shot_acc_norm": 53, "cmmlu-computer_security_5shot_acc_norm": 67, "cmmlu-international_law_5shot_acc_norm": 75.21, "cmmlu-logical_fallacies_5shot_acc_norm": 59.51, "cmmlu-us_foreign_policy_5shot_acc_norm": 79, "cmmlu-clinical_knowledge_5shot_acc_norm": 64.53, "cmmlu-conceptual_physics_5shot_acc_norm": 65.96, "cmmlu-college_mathematics_5shot_acc_norm": 49, "cmmlu-high_school_biology_5shot_acc_norm": 73.55, "cmmlu-high_school_physics_5shot_acc_norm": 47.02, "cmmlu-high_school_chemistry_5shot_acc_norm": 55.67, "cmmlu-high_school_geography_5shot_acc_norm": 72.22, "cmmlu-professional_medicine_5shot_acc_norm": 54.78, "cmmlu-electrical_engineering_5shot_acc_norm": 65.52, "cmmlu-elementary_mathematics_5shot_acc_norm": 58.73, "cmmlu-high_school_psychology_5shot_acc_norm": 75.6, "cmmlu-high_school_statistics_5shot_acc_norm": 60.65, "cmmlu-high_school_us_history_5shot_acc_norm": 71.57, "cmmlu-high_school_mathematics_5shot_acc_norm": 42.22, "cmmlu-professional_accounting_5shot_acc_norm": 50.35, "cmmlu-professional_psychology_5shot_acc_norm": 59.31, 
"cmmlu-college_computer_science_5shot_acc_norm": 57, "cmmlu-high_school_world_history_5shot_acc_norm": 78.06, "cmmlu-high_school_macroeconomics_5shot_acc_norm": 67.95, "cmmlu-high_school_microeconomics_5shot_acc_norm": 79.41, "cmmlu-high_school_computer_science_5shot_acc_norm": 73, "cmmlu-high_school_european_history_5shot_acc_norm": 75.15, "cmmlu-high_school_government_and_politics_5shot_acc_norm": 68.91 } }
{}
{}
{}
{}
{ "model_name": "01-ai/Yi-1.5-9B", "model_dtype": "float16", "model_size": 0 }
{ "harness-c_arc_challenge": { "acc_norm": 56.23, "acc_stderr": 0, "c_arc_challenge_25shot_acc": 49.74, "c_arc_challenge_25shot_acc_norm": 56.23 }, "harness-c_gsm8k": { "acc": 51.1, "acc_stderr": 0, "c_gsm8k_5shot_acc": 51.1 }, "harness-c_hellaswag": { "acc_norm": 63.42, "acc_stderr": 0, "c_hellaswag_10shot_acc": 46.4, "c_hellaswag_10shot_acc_norm": 63.42 }, "harness-c-sem-v2": { "acc": 82.02499999999999, "acc_stderr": 0, "c_sem_v2-LLSRC_5shot_acc": 86.76, "c_sem_v2-SLPWC_5shot_acc": 73.71, "c_sem_v2-SLRFC_5shot_acc": 83.17, "c_sem_v2-SLSRC_5shot_acc": 84.46, "c_sem_v2-LLSRC_5shot_acc_norm": 86.76, "c_sem_v2-SLPWC_5shot_acc_norm": 73.71, "c_sem_v2-SLRFC_5shot_acc_norm": 83.17, "c_sem_v2-SLSRC_5shot_acc_norm": 84.46 }, "harness-c_truthfulqa_mc": { "mc2": 49.92, "acc_stderr": 0, "c_truthfulqa_mc_0shot_mc1": 27.66, "c_truthfulqa_mc_0shot_mc2": 49.92 }, "harness-c_winogrande": { "acc": 65.67, "acc_stderr": 0, "c_winogrande_0shot_acc": 65.67 }, "harness-cmmlu": { "acc_norm": 64.27, "acc_stderr": 0, "cmmlu-anatomy_5shot_acc": 53.33, "cmmlu_fullavg_5shot_acc": 64.27, "cmmlu-virology_5shot_acc": 49.4, "cmmlu-astronomy_5shot_acc": 77.63, "cmmlu-marketing_5shot_acc": 82.91, "cmmlu-nutrition_5shot_acc": 70.59, "cmmlu-sociology_5shot_acc": 80.1, "cmmlu-management_5shot_acc": 76.7, "cmmlu-philosophy_5shot_acc": 65.92, "cmmlu-prehistory_5shot_acc": 62.35, "cmmlu-human_aging_5shot_acc": 68.16, "cmmlu-econometrics_5shot_acc": 57.89, "cmmlu-formal_logic_5shot_acc": 48.8, "cmmlu-global_facts_5shot_acc": 43, "cmmlu-jurisprudence_5shot_acc": 75, "cmmlu-miscellaneous_5shot_acc": 72.54, "cmmlu-moral_disputes_5shot_acc": 67.34, "cmmlu-business_ethics_5shot_acc": 70, "cmmlu-college_biology_5shot_acc": 63.89, "cmmlu-college_physics_5shot_acc": 44.12, "cmmlu-human_sexuality_5shot_acc": 67.94, "cmmlu-moral_scenarios_5shot_acc": 33.97, "cmmlu-world_religions_5shot_acc": 70.18, "cmmlu-abstract_algebra_5shot_acc": 37, "cmmlu-college_medicine_5shot_acc": 63.58, 
"cmmlu-machine_learning_5shot_acc": 50, "cmmlu-medical_genetics_5shot_acc": 62, "cmmlu-professional_law_5shot_acc": 44.78, "cmmlu-public_relations_5shot_acc": 66.36, "cmmlu-security_studies_5shot_acc": 73.06, "cmmlu-college_chemistry_5shot_acc": 57, "cmmlu-computer_security_5shot_acc": 74, "cmmlu-international_law_5shot_acc": 79.34, "cmmlu-logical_fallacies_5shot_acc": 68.1, "cmmlu-us_foreign_policy_5shot_acc": 83, "cmmlu-clinical_knowledge_5shot_acc": 65.66, "cmmlu-conceptual_physics_5shot_acc": 64.26, "cmmlu-college_mathematics_5shot_acc": 46, "cmmlu-high_school_biology_5shot_acc": 77.42, "cmmlu-high_school_physics_5shot_acc": 41.06, "cmmlu-high_school_chemistry_5shot_acc": 63.05, "cmmlu-high_school_geography_5shot_acc": 80.81, "cmmlu-professional_medicine_5shot_acc": 61.03, "cmmlu-electrical_engineering_5shot_acc": 64.83, "cmmlu-elementary_mathematics_5shot_acc": 57.41, "cmmlu-high_school_psychology_5shot_acc": 78.17, "cmmlu-high_school_statistics_5shot_acc": 61.57, "cmmlu-high_school_us_history_5shot_acc": 81.86, "cmmlu-high_school_mathematics_5shot_acc": 37.78, "cmmlu-professional_accounting_5shot_acc": 51.42, "cmmlu-professional_psychology_5shot_acc": 63.73, "cmmlu-college_computer_science_5shot_acc": 49, "cmmlu-high_school_world_history_5shot_acc": 74.68, "cmmlu-high_school_macroeconomics_5shot_acc": 72.31, "cmmlu-high_school_microeconomics_5shot_acc": 75.21, "cmmlu-high_school_computer_science_5shot_acc": 78, "cmmlu-high_school_european_history_5shot_acc": 76.97, "cmmlu-high_school_government_and_politics_5shot_acc": 81.35, "cmmlu-anatomy_5shot_acc_norm": 53.33, "cmmlu_fullavg_5shot_acc_norm": 64.27, "cmmlu-virology_5shot_acc_norm": 49.4, "cmmlu-astronomy_5shot_acc_norm": 77.63, "cmmlu-marketing_5shot_acc_norm": 82.91, "cmmlu-nutrition_5shot_acc_norm": 70.59, "cmmlu-sociology_5shot_acc_norm": 80.1, "cmmlu-management_5shot_acc_norm": 76.7, "cmmlu-philosophy_5shot_acc_norm": 65.92, "cmmlu-prehistory_5shot_acc_norm": 62.35, "cmmlu-human_aging_5shot_acc_norm": 
68.16, "cmmlu-econometrics_5shot_acc_norm": 57.89, "cmmlu-formal_logic_5shot_acc_norm": 48.8, "cmmlu-global_facts_5shot_acc_norm": 43, "cmmlu-jurisprudence_5shot_acc_norm": 75, "cmmlu-miscellaneous_5shot_acc_norm": 72.54, "cmmlu-moral_disputes_5shot_acc_norm": 67.34, "cmmlu-business_ethics_5shot_acc_norm": 70, "cmmlu-college_biology_5shot_acc_norm": 63.89, "cmmlu-college_physics_5shot_acc_norm": 44.12, "cmmlu-human_sexuality_5shot_acc_norm": 67.94, "cmmlu-moral_scenarios_5shot_acc_norm": 33.97, "cmmlu-world_religions_5shot_acc_norm": 70.18, "cmmlu-abstract_algebra_5shot_acc_norm": 37, "cmmlu-college_medicine_5shot_acc_norm": 63.58, "cmmlu-machine_learning_5shot_acc_norm": 50, "cmmlu-medical_genetics_5shot_acc_norm": 62, "cmmlu-professional_law_5shot_acc_norm": 44.78, "cmmlu-public_relations_5shot_acc_norm": 66.36, "cmmlu-security_studies_5shot_acc_norm": 73.06, "cmmlu-college_chemistry_5shot_acc_norm": 57, "cmmlu-computer_security_5shot_acc_norm": 74, "cmmlu-international_law_5shot_acc_norm": 79.34, "cmmlu-logical_fallacies_5shot_acc_norm": 68.1, "cmmlu-us_foreign_policy_5shot_acc_norm": 83, "cmmlu-clinical_knowledge_5shot_acc_norm": 65.66, "cmmlu-conceptual_physics_5shot_acc_norm": 64.26, "cmmlu-college_mathematics_5shot_acc_norm": 46, "cmmlu-high_school_biology_5shot_acc_norm": 77.42, "cmmlu-high_school_physics_5shot_acc_norm": 41.06, "cmmlu-high_school_chemistry_5shot_acc_norm": 63.05, "cmmlu-high_school_geography_5shot_acc_norm": 80.81, "cmmlu-professional_medicine_5shot_acc_norm": 61.03, "cmmlu-electrical_engineering_5shot_acc_norm": 64.83, "cmmlu-elementary_mathematics_5shot_acc_norm": 57.41, "cmmlu-high_school_psychology_5shot_acc_norm": 78.17, "cmmlu-high_school_statistics_5shot_acc_norm": 61.57, "cmmlu-high_school_us_history_5shot_acc_norm": 81.86, "cmmlu-high_school_mathematics_5shot_acc_norm": 37.78, "cmmlu-professional_accounting_5shot_acc_norm": 51.42, "cmmlu-professional_psychology_5shot_acc_norm": 63.73, 
"cmmlu-college_computer_science_5shot_acc_norm": 49, "cmmlu-high_school_world_history_5shot_acc_norm": 74.68, "cmmlu-high_school_macroeconomics_5shot_acc_norm": 72.31, "cmmlu-high_school_microeconomics_5shot_acc_norm": 75.21, "cmmlu-high_school_computer_science_5shot_acc_norm": 78, "cmmlu-high_school_european_history_5shot_acc_norm": 76.97, "cmmlu-high_school_government_and_politics_5shot_acc_norm": 81.35 } }
{}
{}
{}
{}
{ "model_name": "01-ai/Yi-34B-200K", "model_dtype": "float16", "model_size": 0 }
{ "harness-c_arc_challenge": { "acc_norm": 57.17, "acc_stderr": 0, "c_arc_challenge_25shot_acc": 51.96, "c_arc_challenge_25shot_acc_norm": 57.17 }, "harness-c_gsm8k": { "acc": 15.47, "acc_stderr": 0, "c_gsm8k_5shot_acc": 15.47 }, "harness-c_hellaswag": { "acc_norm": 66.01, "acc_stderr": 0, "c_hellaswag_10shot_acc": 47.27, "c_hellaswag_10shot_acc_norm": 66.01 }, "harness-c-sem-v2": { "acc": 85.7175, "acc_stderr": 0, "c_sem_v2-LLSRC_5shot_acc": 88.2, "c_sem_v2-SLPWC_5shot_acc": 80.71, "c_sem_v2-SLRFC_5shot_acc": 88.06, "c_sem_v2-SLSRC_5shot_acc": 85.9, "c_sem_v2-LLSRC_5shot_acc_norm": 88.2, "c_sem_v2-SLPWC_5shot_acc_norm": 80.71, "c_sem_v2-SLRFC_5shot_acc_norm": 88.06, "c_sem_v2-SLSRC_5shot_acc_norm": 85.9 }, "harness-c_truthfulqa_mc": { "mc2": 43.18, "acc_stderr": 0, "c_truthfulqa_mc_0shot_mc1": 23.5, "c_truthfulqa_mc_0shot_mc2": 43.18 }, "harness-c_winogrande": { "acc": 69.14, "acc_stderr": 0, "c_winogrande_0shot_acc": 69.14 }, "harness-cmmlu": { "acc_norm": 68.37, "acc_stderr": 0, "cmmlu-anatomy_5shot_acc": 60.74, "cmmlu_fullavg_5shot_acc": 68.37, "cmmlu-virology_5shot_acc": 53.01, "cmmlu-astronomy_5shot_acc": 77.63, "cmmlu-marketing_5shot_acc": 88.03, "cmmlu-nutrition_5shot_acc": 78.43, "cmmlu-sociology_5shot_acc": 81.59, "cmmlu-management_5shot_acc": 76.7, "cmmlu-philosophy_5shot_acc": 74.28, "cmmlu-prehistory_5shot_acc": 72.84, "cmmlu-human_aging_5shot_acc": 71.3, "cmmlu-econometrics_5shot_acc": 53.51, "cmmlu-formal_logic_5shot_acc": 44, "cmmlu-global_facts_5shot_acc": 48, "cmmlu-jurisprudence_5shot_acc": 83.33, "cmmlu-miscellaneous_5shot_acc": 78.8, "cmmlu-moral_disputes_5shot_acc": 72.83, "cmmlu-business_ethics_5shot_acc": 71, "cmmlu-college_biology_5shot_acc": 70.14, "cmmlu-college_physics_5shot_acc": 44.12, "cmmlu-human_sexuality_5shot_acc": 77.1, "cmmlu-moral_scenarios_5shot_acc": 51.51, "cmmlu-world_religions_5shot_acc": 77.19, "cmmlu-abstract_algebra_5shot_acc": 34, "cmmlu-college_medicine_5shot_acc": 70.52, "cmmlu-machine_learning_5shot_acc": 50, 
"cmmlu-medical_genetics_5shot_acc": 73, "cmmlu-professional_law_5shot_acc": 51.83, "cmmlu-public_relations_5shot_acc": 66.36, "cmmlu-security_studies_5shot_acc": 76.73, "cmmlu-college_chemistry_5shot_acc": 49, "cmmlu-computer_security_5shot_acc": 75, "cmmlu-international_law_5shot_acc": 85.12, "cmmlu-logical_fallacies_5shot_acc": 69.94, "cmmlu-us_foreign_policy_5shot_acc": 89, "cmmlu-clinical_knowledge_5shot_acc": 74.34, "cmmlu-conceptual_physics_5shot_acc": 71.49, "cmmlu-college_mathematics_5shot_acc": 38, "cmmlu-high_school_biology_5shot_acc": 81.94, "cmmlu-high_school_physics_5shot_acc": 46.36, "cmmlu-high_school_chemistry_5shot_acc": 57.14, "cmmlu-high_school_geography_5shot_acc": 85.86, "cmmlu-professional_medicine_5shot_acc": 71.69, "cmmlu-electrical_engineering_5shot_acc": 67.59, "cmmlu-elementary_mathematics_5shot_acc": 59.52, "cmmlu-high_school_psychology_5shot_acc": 82.94, "cmmlu-high_school_statistics_5shot_acc": 59.26, "cmmlu-high_school_us_history_5shot_acc": 86.27, "cmmlu-high_school_mathematics_5shot_acc": 42.59, "cmmlu-professional_accounting_5shot_acc": 58.16, "cmmlu-professional_psychology_5shot_acc": 71.24, "cmmlu-college_computer_science_5shot_acc": 57, "cmmlu-high_school_world_history_5shot_acc": 85.23, "cmmlu-high_school_macroeconomics_5shot_acc": 76.67, "cmmlu-high_school_microeconomics_5shot_acc": 79.83, "cmmlu-high_school_computer_science_5shot_acc": 78, "cmmlu-high_school_european_history_5shot_acc": 78.18, "cmmlu-high_school_government_and_politics_5shot_acc": 91.19, "cmmlu-anatomy_5shot_acc_norm": 60.74, "cmmlu_fullavg_5shot_acc_norm": 68.37, "cmmlu-virology_5shot_acc_norm": 53.01, "cmmlu-astronomy_5shot_acc_norm": 77.63, "cmmlu-marketing_5shot_acc_norm": 88.03, "cmmlu-nutrition_5shot_acc_norm": 78.43, "cmmlu-sociology_5shot_acc_norm": 81.59, "cmmlu-management_5shot_acc_norm": 76.7, "cmmlu-philosophy_5shot_acc_norm": 74.28, "cmmlu-prehistory_5shot_acc_norm": 72.84, "cmmlu-human_aging_5shot_acc_norm": 71.3, 
"cmmlu-econometrics_5shot_acc_norm": 53.51, "cmmlu-formal_logic_5shot_acc_norm": 44, "cmmlu-global_facts_5shot_acc_norm": 48, "cmmlu-jurisprudence_5shot_acc_norm": 83.33, "cmmlu-miscellaneous_5shot_acc_norm": 78.8, "cmmlu-moral_disputes_5shot_acc_norm": 72.83, "cmmlu-business_ethics_5shot_acc_norm": 71, "cmmlu-college_biology_5shot_acc_norm": 70.14, "cmmlu-college_physics_5shot_acc_norm": 44.12, "cmmlu-human_sexuality_5shot_acc_norm": 77.1, "cmmlu-moral_scenarios_5shot_acc_norm": 51.51, "cmmlu-world_religions_5shot_acc_norm": 77.19, "cmmlu-abstract_algebra_5shot_acc_norm": 34, "cmmlu-college_medicine_5shot_acc_norm": 70.52, "cmmlu-machine_learning_5shot_acc_norm": 50, "cmmlu-medical_genetics_5shot_acc_norm": 73, "cmmlu-professional_law_5shot_acc_norm": 51.83, "cmmlu-public_relations_5shot_acc_norm": 66.36, "cmmlu-security_studies_5shot_acc_norm": 76.73, "cmmlu-college_chemistry_5shot_acc_norm": 49, "cmmlu-computer_security_5shot_acc_norm": 75, "cmmlu-international_law_5shot_acc_norm": 85.12, "cmmlu-logical_fallacies_5shot_acc_norm": 69.94, "cmmlu-us_foreign_policy_5shot_acc_norm": 89, "cmmlu-clinical_knowledge_5shot_acc_norm": 74.34, "cmmlu-conceptual_physics_5shot_acc_norm": 71.49, "cmmlu-college_mathematics_5shot_acc_norm": 38, "cmmlu-high_school_biology_5shot_acc_norm": 81.94, "cmmlu-high_school_physics_5shot_acc_norm": 46.36, "cmmlu-high_school_chemistry_5shot_acc_norm": 57.14, "cmmlu-high_school_geography_5shot_acc_norm": 85.86, "cmmlu-professional_medicine_5shot_acc_norm": 71.69, "cmmlu-electrical_engineering_5shot_acc_norm": 67.59, "cmmlu-elementary_mathematics_5shot_acc_norm": 59.52, "cmmlu-high_school_psychology_5shot_acc_norm": 82.94, "cmmlu-high_school_statistics_5shot_acc_norm": 59.26, "cmmlu-high_school_us_history_5shot_acc_norm": 86.27, "cmmlu-high_school_mathematics_5shot_acc_norm": 42.59, "cmmlu-professional_accounting_5shot_acc_norm": 58.16, "cmmlu-professional_psychology_5shot_acc_norm": 71.24, "cmmlu-college_computer_science_5shot_acc_norm": 57, 
"cmmlu-high_school_world_history_5shot_acc_norm": 85.23, "cmmlu-high_school_macroeconomics_5shot_acc_norm": 76.67, "cmmlu-high_school_microeconomics_5shot_acc_norm": 79.83, "cmmlu-high_school_computer_science_5shot_acc_norm": 78, "cmmlu-high_school_european_history_5shot_acc_norm": 78.18, "cmmlu-high_school_government_and_politics_5shot_acc_norm": 91.19 } }
{}
{}
{}
{}
{ "model_name": "01-ai/Yi-34B-Chat", "model_dtype": "float16", "model_size": 0 }
{ "harness-c_arc_challenge": { "acc_norm": 58.62, "acc_stderr": 0, "c_arc_challenge_25shot_acc": 54.52, "c_arc_challenge_25shot_acc_norm": 58.62 }, "harness-c_gsm8k": { "acc": 0.61, "acc_stderr": 0, "c_gsm8k_5shot_acc": 0.61 }, "harness-c_hellaswag": { "acc_norm": 66.42, "acc_stderr": 0, "c_hellaswag_10shot_acc": 49.17, "c_hellaswag_10shot_acc_norm": 66.42 }, "harness-c-sem-v2": { "acc": 87.4025, "acc_stderr": 0, "c_sem_v2-LLSRC_5shot_acc": 88.63, "c_sem_v2-SLPWC_5shot_acc": 84.29, "c_sem_v2-SLRFC_5shot_acc": 90.22, "c_sem_v2-SLSRC_5shot_acc": 86.47, "c_sem_v2-LLSRC_5shot_acc_norm": 88.63, "c_sem_v2-SLPWC_5shot_acc_norm": 84.29, "c_sem_v2-SLRFC_5shot_acc_norm": 90.22, "c_sem_v2-SLSRC_5shot_acc_norm": 86.47 }, "harness-c_truthfulqa_mc": { "mc2": 51.3, "acc_stderr": 0, "c_truthfulqa_mc_0shot_mc1": 29.5, "c_truthfulqa_mc_0shot_mc2": 51.3 }, "harness-c_winogrande": { "acc": 66.46, "acc_stderr": 0, "c_winogrande_0shot_acc": 66.46 }, "CLCC-H": { "acc": 0.7643, "acc_stderr": 0 }, "harness-cmmlu": { "acc_norm": 68.3, "acc_stderr": 0, "cmmlu-anatomy_5shot_acc": 60, "cmmlu_fullavg_5shot_acc": 68.3, "cmmlu-virology_5shot_acc": 48.8, "cmmlu-astronomy_5shot_acc": 81.58, "cmmlu-marketing_5shot_acc": 88.89, "cmmlu-nutrition_5shot_acc": 76.14, "cmmlu-sociology_5shot_acc": 82.09, "cmmlu-management_5shot_acc": 75.73, "cmmlu-philosophy_5shot_acc": 72.03, "cmmlu-prehistory_5shot_acc": 72.84, "cmmlu-human_aging_5shot_acc": 69.96, "cmmlu-econometrics_5shot_acc": 50.88, "cmmlu-formal_logic_5shot_acc": 47.2, "cmmlu-global_facts_5shot_acc": 53, "cmmlu-jurisprudence_5shot_acc": 85.19, "cmmlu-miscellaneous_5shot_acc": 80.59, "cmmlu-moral_disputes_5shot_acc": 73.12, "cmmlu-business_ethics_5shot_acc": 72, "cmmlu-college_biology_5shot_acc": 67.36, "cmmlu-college_physics_5shot_acc": 46.08, "cmmlu-human_sexuality_5shot_acc": 80.15, "cmmlu-moral_scenarios_5shot_acc": 58.32, "cmmlu-world_religions_5shot_acc": 72.51, "cmmlu-abstract_algebra_5shot_acc": 35, "cmmlu-college_medicine_5shot_acc": 66.47, 
"cmmlu-machine_learning_5shot_acc": 50.89, "cmmlu-medical_genetics_5shot_acc": 75, "cmmlu-professional_law_5shot_acc": 48.57, "cmmlu-public_relations_5shot_acc": 68.18, "cmmlu-security_studies_5shot_acc": 82.45, "cmmlu-college_chemistry_5shot_acc": 42, "cmmlu-computer_security_5shot_acc": 78, "cmmlu-international_law_5shot_acc": 85.12, "cmmlu-logical_fallacies_5shot_acc": 72.39, "cmmlu-us_foreign_policy_5shot_acc": 85, "cmmlu-clinical_knowledge_5shot_acc": 73.21, "cmmlu-conceptual_physics_5shot_acc": 70.64, "cmmlu-college_mathematics_5shot_acc": 34, "cmmlu-high_school_biology_5shot_acc": 80.32, "cmmlu-high_school_physics_5shot_acc": 47.68, "cmmlu-high_school_chemistry_5shot_acc": 61.58, "cmmlu-high_school_geography_5shot_acc": 85.35, "cmmlu-professional_medicine_5shot_acc": 70.59, "cmmlu-electrical_engineering_5shot_acc": 64.14, "cmmlu-elementary_mathematics_5shot_acc": 59.79, "cmmlu-high_school_psychology_5shot_acc": 83.12, "cmmlu-high_school_statistics_5shot_acc": 59.72, "cmmlu-high_school_us_history_5shot_acc": 87.75, "cmmlu-high_school_mathematics_5shot_acc": 36.3, "cmmlu-professional_accounting_5shot_acc": 54.61, "cmmlu-professional_psychology_5shot_acc": 69.12, "cmmlu-college_computer_science_5shot_acc": 57, "cmmlu-high_school_world_history_5shot_acc": 86.08, "cmmlu-high_school_macroeconomics_5shot_acc": 75.9, "cmmlu-high_school_microeconomics_5shot_acc": 78.57, "cmmlu-high_school_computer_science_5shot_acc": 81, "cmmlu-high_school_european_history_5shot_acc": 82.42, "cmmlu-high_school_government_and_politics_5shot_acc": 90.67, "cmmlu-anatomy_5shot_acc_norm": 60, "cmmlu_fullavg_5shot_acc_norm": 68.3, "cmmlu-virology_5shot_acc_norm": 48.8, "cmmlu-astronomy_5shot_acc_norm": 81.58, "cmmlu-marketing_5shot_acc_norm": 88.89, "cmmlu-nutrition_5shot_acc_norm": 76.14, "cmmlu-sociology_5shot_acc_norm": 82.09, "cmmlu-management_5shot_acc_norm": 75.73, "cmmlu-philosophy_5shot_acc_norm": 72.03, "cmmlu-prehistory_5shot_acc_norm": 72.84, "cmmlu-human_aging_5shot_acc_norm": 
69.96, "cmmlu-econometrics_5shot_acc_norm": 50.88, "cmmlu-formal_logic_5shot_acc_norm": 47.2, "cmmlu-global_facts_5shot_acc_norm": 53, "cmmlu-jurisprudence_5shot_acc_norm": 85.19, "cmmlu-miscellaneous_5shot_acc_norm": 80.59, "cmmlu-moral_disputes_5shot_acc_norm": 73.12, "cmmlu-business_ethics_5shot_acc_norm": 72, "cmmlu-college_biology_5shot_acc_norm": 67.36, "cmmlu-college_physics_5shot_acc_norm": 46.08, "cmmlu-human_sexuality_5shot_acc_norm": 80.15, "cmmlu-moral_scenarios_5shot_acc_norm": 58.32, "cmmlu-world_religions_5shot_acc_norm": 72.51, "cmmlu-abstract_algebra_5shot_acc_norm": 35, "cmmlu-college_medicine_5shot_acc_norm": 66.47, "cmmlu-machine_learning_5shot_acc_norm": 50.89, "cmmlu-medical_genetics_5shot_acc_norm": 75, "cmmlu-professional_law_5shot_acc_norm": 48.57, "cmmlu-public_relations_5shot_acc_norm": 68.18, "cmmlu-security_studies_5shot_acc_norm": 82.45, "cmmlu-college_chemistry_5shot_acc_norm": 42, "cmmlu-computer_security_5shot_acc_norm": 78, "cmmlu-international_law_5shot_acc_norm": 85.12, "cmmlu-logical_fallacies_5shot_acc_norm": 72.39, "cmmlu-us_foreign_policy_5shot_acc_norm": 85, "cmmlu-clinical_knowledge_5shot_acc_norm": 73.21, "cmmlu-conceptual_physics_5shot_acc_norm": 70.64, "cmmlu-college_mathematics_5shot_acc_norm": 34, "cmmlu-high_school_biology_5shot_acc_norm": 80.32, "cmmlu-high_school_physics_5shot_acc_norm": 47.68, "cmmlu-high_school_chemistry_5shot_acc_norm": 61.58, "cmmlu-high_school_geography_5shot_acc_norm": 85.35, "cmmlu-professional_medicine_5shot_acc_norm": 70.59, "cmmlu-electrical_engineering_5shot_acc_norm": 64.14, "cmmlu-elementary_mathematics_5shot_acc_norm": 59.79, "cmmlu-high_school_psychology_5shot_acc_norm": 83.12, "cmmlu-high_school_statistics_5shot_acc_norm": 59.72, "cmmlu-high_school_us_history_5shot_acc_norm": 87.75, "cmmlu-high_school_mathematics_5shot_acc_norm": 36.3, "cmmlu-professional_accounting_5shot_acc_norm": 54.61, "cmmlu-professional_psychology_5shot_acc_norm": 69.12, 
"cmmlu-college_computer_science_5shot_acc_norm": 57, "cmmlu-high_school_world_history_5shot_acc_norm": 86.08, "cmmlu-high_school_macroeconomics_5shot_acc_norm": 75.9, "cmmlu-high_school_microeconomics_5shot_acc_norm": 78.57, "cmmlu-high_school_computer_science_5shot_acc_norm": 81, "cmmlu-high_school_european_history_5shot_acc_norm": 82.42, "cmmlu-high_school_government_and_politics_5shot_acc_norm": 90.67 } }
{}
{}
{}
{}
{ "model_name": "01-ai/Yi-34B", "model_dtype": "float16", "model_size": 0 }
{ "harness-c_arc_challenge": { "acc_norm": 58.28, "acc_stderr": 0, "c_arc_challenge_25shot_acc": 54.18, "c_arc_challenge_25shot_acc_norm": 58.28 }, "harness-c_gsm8k": { "acc": 50.42, "acc_stderr": 0, "c_gsm8k_5shot_acc": 50.42 }, "harness-c_hellaswag": { "acc_norm": 68.92, "acc_stderr": 0, "c_hellaswag_10shot_acc": 50.17, "c_hellaswag_10shot_acc_norm": 68.92 }, "harness-c-sem-v2": { "acc": 87.8325, "acc_stderr": 0, "c_sem_v2-LLSRC_5shot_acc": 89.06, "c_sem_v2-SLPWC_5shot_acc": 83.57, "c_sem_v2-SLRFC_5shot_acc": 91.94, "c_sem_v2-SLSRC_5shot_acc": 86.76, "c_sem_v2-LLSRC_5shot_acc_norm": 89.06, "c_sem_v2-SLPWC_5shot_acc_norm": 83.57, "c_sem_v2-SLRFC_5shot_acc_norm": 91.94, "c_sem_v2-SLSRC_5shot_acc_norm": 86.76 }, "harness-c_truthfulqa_mc": { "mc2": 51.11, "acc_stderr": 0, "c_truthfulqa_mc_0shot_mc1": 31.09, "c_truthfulqa_mc_0shot_mc2": 51.11 }, "harness-c_winogrande": { "acc": 70.8, "acc_stderr": 0, "c_winogrande_0shot_acc": 70.8 }, "harness-cmmlu": { "acc_norm": 69.79, "acc_stderr": 0, "cmmlu-anatomy_5shot_acc": 64.44, "cmmlu_fullavg_5shot_acc": 69.79, "cmmlu-virology_5shot_acc": 54.22, "cmmlu-astronomy_5shot_acc": 83.55, "cmmlu-marketing_5shot_acc": 86.75, "cmmlu-nutrition_5shot_acc": 79.74, "cmmlu-sociology_5shot_acc": 84.58, "cmmlu-management_5shot_acc": 75.73, "cmmlu-philosophy_5shot_acc": 76.21, "cmmlu-prehistory_5shot_acc": 75.93, "cmmlu-human_aging_5shot_acc": 73.54, "cmmlu-econometrics_5shot_acc": 52.63, "cmmlu-formal_logic_5shot_acc": 40, "cmmlu-global_facts_5shot_acc": 51, "cmmlu-jurisprudence_5shot_acc": 81.48, "cmmlu-miscellaneous_5shot_acc": 80.2, "cmmlu-moral_disputes_5shot_acc": 74.86, "cmmlu-business_ethics_5shot_acc": 72, "cmmlu-college_biology_5shot_acc": 71.53, "cmmlu-college_physics_5shot_acc": 45.1, "cmmlu-human_sexuality_5shot_acc": 77.1, "cmmlu-moral_scenarios_5shot_acc": 51.62, "cmmlu-world_religions_5shot_acc": 73.1, "cmmlu-abstract_algebra_5shot_acc": 32, "cmmlu-college_medicine_5shot_acc": 73.99, "cmmlu-machine_learning_5shot_acc": 56.25, 
"cmmlu-medical_genetics_5shot_acc": 78, "cmmlu-professional_law_5shot_acc": 51.04, "cmmlu-public_relations_5shot_acc": 69.09, "cmmlu-security_studies_5shot_acc": 80.82, "cmmlu-college_chemistry_5shot_acc": 46, "cmmlu-computer_security_5shot_acc": 81, "cmmlu-international_law_5shot_acc": 88.43, "cmmlu-logical_fallacies_5shot_acc": 71.17, "cmmlu-us_foreign_policy_5shot_acc": 89, "cmmlu-clinical_knowledge_5shot_acc": 72.83, "cmmlu-conceptual_physics_5shot_acc": 69.79, "cmmlu-college_mathematics_5shot_acc": 44, "cmmlu-high_school_biology_5shot_acc": 81.61, "cmmlu-high_school_physics_5shot_acc": 44.37, "cmmlu-high_school_chemistry_5shot_acc": 58.13, "cmmlu-high_school_geography_5shot_acc": 83.33, "cmmlu-professional_medicine_5shot_acc": 76.84, "cmmlu-electrical_engineering_5shot_acc": 69.66, "cmmlu-elementary_mathematics_5shot_acc": 60.32, "cmmlu-high_school_psychology_5shot_acc": 84.4, "cmmlu-high_school_statistics_5shot_acc": 60.19, "cmmlu-high_school_us_history_5shot_acc": 88.73, "cmmlu-high_school_mathematics_5shot_acc": 44.07, "cmmlu-professional_accounting_5shot_acc": 60.64, "cmmlu-professional_psychology_5shot_acc": 72.39, "cmmlu-college_computer_science_5shot_acc": 64, "cmmlu-high_school_world_history_5shot_acc": 87.34, "cmmlu-high_school_macroeconomics_5shot_acc": 77.18, "cmmlu-high_school_microeconomics_5shot_acc": 78.99, "cmmlu-high_school_computer_science_5shot_acc": 85, "cmmlu-high_school_european_history_5shot_acc": 81.21, "cmmlu-high_school_government_and_politics_5shot_acc": 90.67, "cmmlu-anatomy_5shot_acc_norm": 64.44, "cmmlu_fullavg_5shot_acc_norm": 69.79, "cmmlu-virology_5shot_acc_norm": 54.22, "cmmlu-astronomy_5shot_acc_norm": 83.55, "cmmlu-marketing_5shot_acc_norm": 86.75, "cmmlu-nutrition_5shot_acc_norm": 79.74, "cmmlu-sociology_5shot_acc_norm": 84.58, "cmmlu-management_5shot_acc_norm": 75.73, "cmmlu-philosophy_5shot_acc_norm": 76.21, "cmmlu-prehistory_5shot_acc_norm": 75.93, "cmmlu-human_aging_5shot_acc_norm": 73.54, 
"cmmlu-econometrics_5shot_acc_norm": 52.63, "cmmlu-formal_logic_5shot_acc_norm": 40, "cmmlu-global_facts_5shot_acc_norm": 51, "cmmlu-jurisprudence_5shot_acc_norm": 81.48, "cmmlu-miscellaneous_5shot_acc_norm": 80.2, "cmmlu-moral_disputes_5shot_acc_norm": 74.86, "cmmlu-business_ethics_5shot_acc_norm": 72, "cmmlu-college_biology_5shot_acc_norm": 71.53, "cmmlu-college_physics_5shot_acc_norm": 45.1, "cmmlu-human_sexuality_5shot_acc_norm": 77.1, "cmmlu-moral_scenarios_5shot_acc_norm": 51.62, "cmmlu-world_religions_5shot_acc_norm": 73.1, "cmmlu-abstract_algebra_5shot_acc_norm": 32, "cmmlu-college_medicine_5shot_acc_norm": 73.99, "cmmlu-machine_learning_5shot_acc_norm": 56.25, "cmmlu-medical_genetics_5shot_acc_norm": 78, "cmmlu-professional_law_5shot_acc_norm": 51.04, "cmmlu-public_relations_5shot_acc_norm": 69.09, "cmmlu-security_studies_5shot_acc_norm": 80.82, "cmmlu-college_chemistry_5shot_acc_norm": 46, "cmmlu-computer_security_5shot_acc_norm": 81, "cmmlu-international_law_5shot_acc_norm": 88.43, "cmmlu-logical_fallacies_5shot_acc_norm": 71.17, "cmmlu-us_foreign_policy_5shot_acc_norm": 89, "cmmlu-clinical_knowledge_5shot_acc_norm": 72.83, "cmmlu-conceptual_physics_5shot_acc_norm": 69.79, "cmmlu-college_mathematics_5shot_acc_norm": 44, "cmmlu-high_school_biology_5shot_acc_norm": 81.61, "cmmlu-high_school_physics_5shot_acc_norm": 44.37, "cmmlu-high_school_chemistry_5shot_acc_norm": 58.13, "cmmlu-high_school_geography_5shot_acc_norm": 83.33, "cmmlu-professional_medicine_5shot_acc_norm": 76.84, "cmmlu-electrical_engineering_5shot_acc_norm": 69.66, "cmmlu-elementary_mathematics_5shot_acc_norm": 60.32, "cmmlu-high_school_psychology_5shot_acc_norm": 84.4, "cmmlu-high_school_statistics_5shot_acc_norm": 60.19, "cmmlu-high_school_us_history_5shot_acc_norm": 88.73, "cmmlu-high_school_mathematics_5shot_acc_norm": 44.07, "cmmlu-professional_accounting_5shot_acc_norm": 60.64, "cmmlu-professional_psychology_5shot_acc_norm": 72.39, "cmmlu-college_computer_science_5shot_acc_norm": 64, 
"cmmlu-high_school_world_history_5shot_acc_norm": 87.34, "cmmlu-high_school_macroeconomics_5shot_acc_norm": 77.18, "cmmlu-high_school_microeconomics_5shot_acc_norm": 78.99, "cmmlu-high_school_computer_science_5shot_acc_norm": 85, "cmmlu-high_school_european_history_5shot_acc_norm": 81.21, "cmmlu-high_school_government_and_politics_5shot_acc_norm": 90.67 } }
{}
{}
{}
{}
{ "model_name": "01-ai/Yi-6B-Chat", "model_dtype": "float16", "model_size": 0 }
{ "harness-c_arc_challenge": { "acc_norm": 51.96, "acc_stderr": 0, "c_arc_challenge_25shot_acc": 47.01, "c_arc_challenge_25shot_acc_norm": 51.96 }, "harness-c_gsm8k": { "acc": 31.61, "acc_stderr": 0, "c_gsm8k_5shot_acc": 31.61 }, "harness-c_hellaswag": { "acc_norm": 60.3, "acc_stderr": 0, "c_hellaswag_10shot_acc": 45.15, "c_hellaswag_10shot_acc_norm": 60.3 }, "harness-c-sem-v2": { "acc": 75.315, "acc_stderr": 0, "c_sem_v2-LLSRC_5shot_acc": 79.14, "c_sem_v2-SLPWC_5shot_acc": 66.29, "c_sem_v2-SLRFC_5shot_acc": 77.7, "c_sem_v2-SLSRC_5shot_acc": 78.13, "c_sem_v2-LLSRC_5shot_acc_norm": 79.14, "c_sem_v2-SLPWC_5shot_acc_norm": 66.29, "c_sem_v2-SLRFC_5shot_acc_norm": 77.7, "c_sem_v2-SLSRC_5shot_acc_norm": 78.13 }, "harness-c_truthfulqa_mc": { "mc2": 50.67, "acc_stderr": 0, "c_truthfulqa_mc_0shot_mc1": 30.35, "c_truthfulqa_mc_0shot_mc2": 50.67 }, "harness-c_winogrande": { "acc": 62.75, "acc_stderr": 0, "c_winogrande_0shot_acc": 62.75 }, "CLCC-H": { "acc": 0.6656, "acc_stderr": 0 }, "harness-cmmlu": { "acc_norm": 56.6, "acc_stderr": 0, "cmmlu-anatomy_5shot_acc": 51.85, "cmmlu_fullavg_5shot_acc": 56.6, "cmmlu-virology_5shot_acc": 45.78, "cmmlu-astronomy_5shot_acc": 53.95, "cmmlu-marketing_5shot_acc": 81.2, "cmmlu-nutrition_5shot_acc": 66.01, "cmmlu-sociology_5shot_acc": 77.11, "cmmlu-management_5shot_acc": 76.7, "cmmlu-philosophy_5shot_acc": 59.81, "cmmlu-prehistory_5shot_acc": 55.25, "cmmlu-human_aging_5shot_acc": 60.99, "cmmlu-econometrics_5shot_acc": 32.46, "cmmlu-formal_logic_5shot_acc": 37.6, "cmmlu-global_facts_5shot_acc": 36, "cmmlu-jurisprudence_5shot_acc": 76.85, "cmmlu-miscellaneous_5shot_acc": 65.01, "cmmlu-moral_disputes_5shot_acc": 64.16, "cmmlu-business_ethics_5shot_acc": 64, "cmmlu-college_biology_5shot_acc": 50, "cmmlu-college_physics_5shot_acc": 37.25, "cmmlu-human_sexuality_5shot_acc": 66.41, "cmmlu-moral_scenarios_5shot_acc": 28.72, "cmmlu-world_religions_5shot_acc": 60.82, "cmmlu-abstract_algebra_5shot_acc": 30, "cmmlu-college_medicine_5shot_acc": 58.96, 
"cmmlu-machine_learning_5shot_acc": 36.61, "cmmlu-medical_genetics_5shot_acc": 65, "cmmlu-professional_law_5shot_acc": 40.09, "cmmlu-public_relations_5shot_acc": 57.27, "cmmlu-security_studies_5shot_acc": 65.71, "cmmlu-college_chemistry_5shot_acc": 42, "cmmlu-computer_security_5shot_acc": 62, "cmmlu-international_law_5shot_acc": 76.86, "cmmlu-logical_fallacies_5shot_acc": 59.51, "cmmlu-us_foreign_policy_5shot_acc": 81, "cmmlu-clinical_knowledge_5shot_acc": 59.25, "cmmlu-conceptual_physics_5shot_acc": 51.49, "cmmlu-college_mathematics_5shot_acc": 35, "cmmlu-high_school_biology_5shot_acc": 67.42, "cmmlu-high_school_physics_5shot_acc": 32.45, "cmmlu-high_school_chemistry_5shot_acc": 44.83, "cmmlu-high_school_geography_5shot_acc": 73.23, "cmmlu-professional_medicine_5shot_acc": 54.41, "cmmlu-electrical_engineering_5shot_acc": 60, "cmmlu-elementary_mathematics_5shot_acc": 42.59, "cmmlu-high_school_psychology_5shot_acc": 73.21, "cmmlu-high_school_statistics_5shot_acc": 51.39, "cmmlu-high_school_us_history_5shot_acc": 68.63, "cmmlu-high_school_mathematics_5shot_acc": 28.52, "cmmlu-professional_accounting_5shot_acc": 43.62, "cmmlu-professional_psychology_5shot_acc": 56.21, "cmmlu-college_computer_science_5shot_acc": 49, "cmmlu-high_school_world_history_5shot_acc": 74.68, "cmmlu-high_school_macroeconomics_5shot_acc": 63.33, "cmmlu-high_school_microeconomics_5shot_acc": 65.13, "cmmlu-high_school_computer_science_5shot_acc": 63, "cmmlu-high_school_european_history_5shot_acc": 70.91, "cmmlu-high_school_government_and_politics_5shot_acc": 75.13, "cmmlu-anatomy_5shot_acc_norm": 51.85, "cmmlu_fullavg_5shot_acc_norm": 56.6, "cmmlu-virology_5shot_acc_norm": 45.78, "cmmlu-astronomy_5shot_acc_norm": 53.95, "cmmlu-marketing_5shot_acc_norm": 81.2, "cmmlu-nutrition_5shot_acc_norm": 66.01, "cmmlu-sociology_5shot_acc_norm": 77.11, "cmmlu-management_5shot_acc_norm": 76.7, "cmmlu-philosophy_5shot_acc_norm": 59.81, "cmmlu-prehistory_5shot_acc_norm": 55.25, "cmmlu-human_aging_5shot_acc_norm": 
60.99, "cmmlu-econometrics_5shot_acc_norm": 32.46, "cmmlu-formal_logic_5shot_acc_norm": 37.6, "cmmlu-global_facts_5shot_acc_norm": 36, "cmmlu-jurisprudence_5shot_acc_norm": 76.85, "cmmlu-miscellaneous_5shot_acc_norm": 65.01, "cmmlu-moral_disputes_5shot_acc_norm": 64.16, "cmmlu-business_ethics_5shot_acc_norm": 64, "cmmlu-college_biology_5shot_acc_norm": 50, "cmmlu-college_physics_5shot_acc_norm": 37.25, "cmmlu-human_sexuality_5shot_acc_norm": 66.41, "cmmlu-moral_scenarios_5shot_acc_norm": 28.72, "cmmlu-world_religions_5shot_acc_norm": 60.82, "cmmlu-abstract_algebra_5shot_acc_norm": 30, "cmmlu-college_medicine_5shot_acc_norm": 58.96, "cmmlu-machine_learning_5shot_acc_norm": 36.61, "cmmlu-medical_genetics_5shot_acc_norm": 65, "cmmlu-professional_law_5shot_acc_norm": 40.09, "cmmlu-public_relations_5shot_acc_norm": 57.27, "cmmlu-security_studies_5shot_acc_norm": 65.71, "cmmlu-college_chemistry_5shot_acc_norm": 42, "cmmlu-computer_security_5shot_acc_norm": 62, "cmmlu-international_law_5shot_acc_norm": 76.86, "cmmlu-logical_fallacies_5shot_acc_norm": 59.51, "cmmlu-us_foreign_policy_5shot_acc_norm": 81, "cmmlu-clinical_knowledge_5shot_acc_norm": 59.25, "cmmlu-conceptual_physics_5shot_acc_norm": 51.49, "cmmlu-college_mathematics_5shot_acc_norm": 35, "cmmlu-high_school_biology_5shot_acc_norm": 67.42, "cmmlu-high_school_physics_5shot_acc_norm": 32.45, "cmmlu-high_school_chemistry_5shot_acc_norm": 44.83, "cmmlu-high_school_geography_5shot_acc_norm": 73.23, "cmmlu-professional_medicine_5shot_acc_norm": 54.41, "cmmlu-electrical_engineering_5shot_acc_norm": 60, "cmmlu-elementary_mathematics_5shot_acc_norm": 42.59, "cmmlu-high_school_psychology_5shot_acc_norm": 73.21, "cmmlu-high_school_statistics_5shot_acc_norm": 51.39, "cmmlu-high_school_us_history_5shot_acc_norm": 68.63, "cmmlu-high_school_mathematics_5shot_acc_norm": 28.52, "cmmlu-professional_accounting_5shot_acc_norm": 43.62, "cmmlu-professional_psychology_5shot_acc_norm": 56.21, 
"cmmlu-college_computer_science_5shot_acc_norm": 49, "cmmlu-high_school_world_history_5shot_acc_norm": 74.68, "cmmlu-high_school_macroeconomics_5shot_acc_norm": 63.33, "cmmlu-high_school_microeconomics_5shot_acc_norm": 65.13, "cmmlu-high_school_computer_science_5shot_acc_norm": 63, "cmmlu-high_school_european_history_5shot_acc_norm": 70.91, "cmmlu-high_school_government_and_politics_5shot_acc_norm": 75.13 } }
{}
{}
{}
{}
{ "model_name": "01-ai/Yi-6B", "model_dtype": "float16", "model_size": 0 }
{ "harness-c_arc_challenge": { "acc_norm": 46.76, "acc_stderr": 0, "c_arc_challenge_25shot_acc": 42.49, "c_arc_challenge_25shot_acc_norm": 46.76 }, "harness-c_gsm8k": { "acc": 24.64, "acc_stderr": 0, "c_gsm8k_5shot_acc": 24.64 }, "harness-c_hellaswag": { "acc_norm": 59.03, "acc_stderr": 0, "c_hellaswag_10shot_acc": 43.56, "c_hellaswag_10shot_acc_norm": 59.03 }, "harness-c-sem-v2": { "acc": 74.77000000000001, "acc_stderr": 0, "c_sem_v2-LLSRC_5shot_acc": 77.12, "c_sem_v2-SLPWC_5shot_acc": 67.86, "c_sem_v2-SLRFC_5shot_acc": 74.68, "c_sem_v2-SLSRC_5shot_acc": 79.42, "c_sem_v2-LLSRC_5shot_acc_norm": 77.12, "c_sem_v2-SLPWC_5shot_acc_norm": 67.86, "c_sem_v2-SLRFC_5shot_acc_norm": 74.68, "c_sem_v2-SLSRC_5shot_acc_norm": 79.42 }, "harness-c_truthfulqa_mc": { "mc2": 44.44, "acc_stderr": 0, "c_truthfulqa_mc_0shot_mc1": 23.87, "c_truthfulqa_mc_0shot_mc2": 44.44 }, "harness-c_winogrande": { "acc": 64.56, "acc_stderr": 0, "c_winogrande_0shot_acc": 64.56 }, "harness-cmmlu": { "acc_norm": 56.54, "acc_stderr": 0, "cmmlu-anatomy_5shot_acc": 53.33, "cmmlu_fullavg_5shot_acc": 56.54, "cmmlu-virology_5shot_acc": 46.39, "cmmlu-astronomy_5shot_acc": 54.61, "cmmlu-marketing_5shot_acc": 82.05, "cmmlu-nutrition_5shot_acc": 65.69, "cmmlu-sociology_5shot_acc": 79.1, "cmmlu-management_5shot_acc": 70.87, "cmmlu-philosophy_5shot_acc": 62.38, "cmmlu-prehistory_5shot_acc": 57.72, "cmmlu-human_aging_5shot_acc": 61.88, "cmmlu-econometrics_5shot_acc": 33.33, "cmmlu-formal_logic_5shot_acc": 36.8, "cmmlu-global_facts_5shot_acc": 39, "cmmlu-jurisprudence_5shot_acc": 76.85, "cmmlu-miscellaneous_5shot_acc": 65.13, "cmmlu-moral_disputes_5shot_acc": 64.16, "cmmlu-business_ethics_5shot_acc": 61, "cmmlu-college_biology_5shot_acc": 53.47, "cmmlu-college_physics_5shot_acc": 36.27, "cmmlu-human_sexuality_5shot_acc": 70.23, "cmmlu-moral_scenarios_5shot_acc": 29.5, "cmmlu-world_religions_5shot_acc": 59.06, "cmmlu-abstract_algebra_5shot_acc": 32, "cmmlu-college_medicine_5shot_acc": 58.96, 
"cmmlu-machine_learning_5shot_acc": 40.18, "cmmlu-medical_genetics_5shot_acc": 65, "cmmlu-professional_law_5shot_acc": 41.46, "cmmlu-public_relations_5shot_acc": 58.18, "cmmlu-security_studies_5shot_acc": 67.35, "cmmlu-college_chemistry_5shot_acc": 40, "cmmlu-computer_security_5shot_acc": 64, "cmmlu-international_law_5shot_acc": 76.03, "cmmlu-logical_fallacies_5shot_acc": 58.28, "cmmlu-us_foreign_policy_5shot_acc": 79, "cmmlu-clinical_knowledge_5shot_acc": 61.51, "cmmlu-conceptual_physics_5shot_acc": 51.06, "cmmlu-college_mathematics_5shot_acc": 35, "cmmlu-high_school_biology_5shot_acc": 65.16, "cmmlu-high_school_physics_5shot_acc": 36.42, "cmmlu-high_school_chemistry_5shot_acc": 44.33, "cmmlu-high_school_geography_5shot_acc": 73.23, "cmmlu-professional_medicine_5shot_acc": 54.78, "cmmlu-electrical_engineering_5shot_acc": 57.93, "cmmlu-elementary_mathematics_5shot_acc": 42.86, "cmmlu-high_school_psychology_5shot_acc": 72.66, "cmmlu-high_school_statistics_5shot_acc": 47.22, "cmmlu-high_school_us_history_5shot_acc": 68.14, "cmmlu-high_school_mathematics_5shot_acc": 26.3, "cmmlu-professional_accounting_5shot_acc": 42.2, "cmmlu-professional_psychology_5shot_acc": 57.19, "cmmlu-college_computer_science_5shot_acc": 43, "cmmlu-high_school_world_history_5shot_acc": 67.51, "cmmlu-high_school_macroeconomics_5shot_acc": 62.82, "cmmlu-high_school_microeconomics_5shot_acc": 67.65, "cmmlu-high_school_computer_science_5shot_acc": 65, "cmmlu-high_school_european_history_5shot_acc": 69.7, "cmmlu-high_school_government_and_politics_5shot_acc": 72.02, "cmmlu-anatomy_5shot_acc_norm": 53.33, "cmmlu_fullavg_5shot_acc_norm": 56.54, "cmmlu-virology_5shot_acc_norm": 46.39, "cmmlu-astronomy_5shot_acc_norm": 54.61, "cmmlu-marketing_5shot_acc_norm": 82.05, "cmmlu-nutrition_5shot_acc_norm": 65.69, "cmmlu-sociology_5shot_acc_norm": 79.1, "cmmlu-management_5shot_acc_norm": 70.87, "cmmlu-philosophy_5shot_acc_norm": 62.38, "cmmlu-prehistory_5shot_acc_norm": 57.72, 
"cmmlu-human_aging_5shot_acc_norm": 61.88, "cmmlu-econometrics_5shot_acc_norm": 33.33, "cmmlu-formal_logic_5shot_acc_norm": 36.8, "cmmlu-global_facts_5shot_acc_norm": 39, "cmmlu-jurisprudence_5shot_acc_norm": 76.85, "cmmlu-miscellaneous_5shot_acc_norm": 65.13, "cmmlu-moral_disputes_5shot_acc_norm": 64.16, "cmmlu-business_ethics_5shot_acc_norm": 61, "cmmlu-college_biology_5shot_acc_norm": 53.47, "cmmlu-college_physics_5shot_acc_norm": 36.27, "cmmlu-human_sexuality_5shot_acc_norm": 70.23, "cmmlu-moral_scenarios_5shot_acc_norm": 29.5, "cmmlu-world_religions_5shot_acc_norm": 59.06, "cmmlu-abstract_algebra_5shot_acc_norm": 32, "cmmlu-college_medicine_5shot_acc_norm": 58.96, "cmmlu-machine_learning_5shot_acc_norm": 40.18, "cmmlu-medical_genetics_5shot_acc_norm": 65, "cmmlu-professional_law_5shot_acc_norm": 41.46, "cmmlu-public_relations_5shot_acc_norm": 58.18, "cmmlu-security_studies_5shot_acc_norm": 67.35, "cmmlu-college_chemistry_5shot_acc_norm": 40, "cmmlu-computer_security_5shot_acc_norm": 64, "cmmlu-international_law_5shot_acc_norm": 76.03, "cmmlu-logical_fallacies_5shot_acc_norm": 58.28, "cmmlu-us_foreign_policy_5shot_acc_norm": 79, "cmmlu-clinical_knowledge_5shot_acc_norm": 61.51, "cmmlu-conceptual_physics_5shot_acc_norm": 51.06, "cmmlu-college_mathematics_5shot_acc_norm": 35, "cmmlu-high_school_biology_5shot_acc_norm": 65.16, "cmmlu-high_school_physics_5shot_acc_norm": 36.42, "cmmlu-high_school_chemistry_5shot_acc_norm": 44.33, "cmmlu-high_school_geography_5shot_acc_norm": 73.23, "cmmlu-professional_medicine_5shot_acc_norm": 54.78, "cmmlu-electrical_engineering_5shot_acc_norm": 57.93, "cmmlu-elementary_mathematics_5shot_acc_norm": 42.86, "cmmlu-high_school_psychology_5shot_acc_norm": 72.66, "cmmlu-high_school_statistics_5shot_acc_norm": 47.22, "cmmlu-high_school_us_history_5shot_acc_norm": 68.14, "cmmlu-high_school_mathematics_5shot_acc_norm": 26.3, "cmmlu-professional_accounting_5shot_acc_norm": 42.2, "cmmlu-professional_psychology_5shot_acc_norm": 57.19, 
"cmmlu-college_computer_science_5shot_acc_norm": 43, "cmmlu-high_school_world_history_5shot_acc_norm": 67.51, "cmmlu-high_school_macroeconomics_5shot_acc_norm": 62.82, "cmmlu-high_school_microeconomics_5shot_acc_norm": 67.65, "cmmlu-high_school_computer_science_5shot_acc_norm": 65, "cmmlu-high_school_european_history_5shot_acc_norm": 69.7, "cmmlu-high_school_government_and_politics_5shot_acc_norm": 72.02 } }
{}
{}
{}
{}
{ "model_name": "01-ai/Yi-9B-200K", "model_dtype": "float16", "model_size": 0 }
{ "harness-c_arc_challenge": { "acc_norm": 52.56, "acc_stderr": 0, "c_arc_challenge_25shot_acc": 47.53, "c_arc_challenge_25shot_acc_norm": 52.56 }, "harness-c_gsm8k": { "acc": 39.88, "acc_stderr": 0, "c_gsm8k_5shot_acc": 39.88 }, "harness-c_hellaswag": { "acc_norm": 62.01, "acc_stderr": 0, "c_hellaswag_10shot_acc": 44.93, "c_hellaswag_10shot_acc_norm": 62.01 }, "harness-c-sem-v2": { "acc": 82.925, "acc_stderr": 0, "c_sem_v2-LLSRC_5shot_acc": 88.35, "c_sem_v2-SLPWC_5shot_acc": 74.86, "c_sem_v2-SLRFC_5shot_acc": 84.17, "c_sem_v2-SLSRC_5shot_acc": 84.32, "c_sem_v2-LLSRC_5shot_acc_norm": 88.35, "c_sem_v2-SLPWC_5shot_acc_norm": 74.86, "c_sem_v2-SLRFC_5shot_acc_norm": 84.17, "c_sem_v2-SLSRC_5shot_acc_norm": 84.32 }, "harness-c_truthfulqa_mc": { "mc2": 43.18, "acc_stderr": 0, "c_truthfulqa_mc_0shot_mc1": 22.89, "c_truthfulqa_mc_0shot_mc2": 43.18 }, "harness-c_winogrande": { "acc": 63.14, "acc_stderr": 0, "c_winogrande_0shot_acc": 63.14 }, "harness-cmmlu": { "acc_norm": 62.82, "acc_stderr": 0, "cmmlu-anatomy_5shot_acc": 54.81, "cmmlu_fullavg_5shot_acc": 62.82, "cmmlu-virology_5shot_acc": 46.39, "cmmlu-astronomy_5shot_acc": 71.05, "cmmlu-marketing_5shot_acc": 82.48, "cmmlu-nutrition_5shot_acc": 68.63, "cmmlu-sociology_5shot_acc": 79.6, "cmmlu-management_5shot_acc": 72.82, "cmmlu-philosophy_5shot_acc": 67.52, "cmmlu-prehistory_5shot_acc": 62.04, "cmmlu-human_aging_5shot_acc": 67.26, "cmmlu-econometrics_5shot_acc": 51.75, "cmmlu-formal_logic_5shot_acc": 48.8, "cmmlu-global_facts_5shot_acc": 42, "cmmlu-jurisprudence_5shot_acc": 78.7, "cmmlu-miscellaneous_5shot_acc": 70.24, "cmmlu-moral_disputes_5shot_acc": 65.32, "cmmlu-business_ethics_5shot_acc": 72, "cmmlu-college_biology_5shot_acc": 63.19, "cmmlu-college_physics_5shot_acc": 43.14, "cmmlu-human_sexuality_5shot_acc": 67.18, "cmmlu-moral_scenarios_5shot_acc": 35.98, "cmmlu-world_religions_5shot_acc": 66.08, "cmmlu-abstract_algebra_5shot_acc": 34, "cmmlu-college_medicine_5shot_acc": 66.47, "cmmlu-machine_learning_5shot_acc": 
46.43, "cmmlu-medical_genetics_5shot_acc": 69, "cmmlu-professional_law_5shot_acc": 43.42, "cmmlu-public_relations_5shot_acc": 65.45, "cmmlu-security_studies_5shot_acc": 73.47, "cmmlu-college_chemistry_5shot_acc": 53, "cmmlu-computer_security_5shot_acc": 72, "cmmlu-international_law_5shot_acc": 81.82, "cmmlu-logical_fallacies_5shot_acc": 64.42, "cmmlu-us_foreign_policy_5shot_acc": 83, "cmmlu-clinical_knowledge_5shot_acc": 69.43, "cmmlu-conceptual_physics_5shot_acc": 58.72, "cmmlu-college_mathematics_5shot_acc": 45, "cmmlu-high_school_biology_5shot_acc": 72.26, "cmmlu-high_school_physics_5shot_acc": 39.74, "cmmlu-high_school_chemistry_5shot_acc": 51.72, "cmmlu-high_school_geography_5shot_acc": 77.27, "cmmlu-professional_medicine_5shot_acc": 58.46, "cmmlu-electrical_engineering_5shot_acc": 62.76, "cmmlu-elementary_mathematics_5shot_acc": 52.12, "cmmlu-high_school_psychology_5shot_acc": 78.53, "cmmlu-high_school_statistics_5shot_acc": 60.19, "cmmlu-high_school_us_history_5shot_acc": 80.88, "cmmlu-high_school_mathematics_5shot_acc": 42.96, "cmmlu-professional_accounting_5shot_acc": 49.29, "cmmlu-professional_psychology_5shot_acc": 62.58, "cmmlu-college_computer_science_5shot_acc": 47, "cmmlu-high_school_world_history_5shot_acc": 75.11, "cmmlu-high_school_macroeconomics_5shot_acc": 65.64, "cmmlu-high_school_microeconomics_5shot_acc": 72.27, "cmmlu-high_school_computer_science_5shot_acc": 74, "cmmlu-high_school_european_history_5shot_acc": 78.18, "cmmlu-high_school_government_and_politics_5shot_acc": 77.2, "cmmlu-anatomy_5shot_acc_norm": 54.81, "cmmlu_fullavg_5shot_acc_norm": 62.82, "cmmlu-virology_5shot_acc_norm": 46.39, "cmmlu-astronomy_5shot_acc_norm": 71.05, "cmmlu-marketing_5shot_acc_norm": 82.48, "cmmlu-nutrition_5shot_acc_norm": 68.63, "cmmlu-sociology_5shot_acc_norm": 79.6, "cmmlu-management_5shot_acc_norm": 72.82, "cmmlu-philosophy_5shot_acc_norm": 67.52, "cmmlu-prehistory_5shot_acc_norm": 62.04, "cmmlu-human_aging_5shot_acc_norm": 67.26, "cmmlu-econometrics_5shot_acc_norm": 51.75, "cmmlu-formal_logic_5shot_acc_norm": 48.8, "cmmlu-global_facts_5shot_acc_norm": 42, "cmmlu-jurisprudence_5shot_acc_norm": 78.7, "cmmlu-miscellaneous_5shot_acc_norm": 70.24, "cmmlu-moral_disputes_5shot_acc_norm": 65.32, "cmmlu-business_ethics_5shot_acc_norm": 72, "cmmlu-college_biology_5shot_acc_norm": 63.19, "cmmlu-college_physics_5shot_acc_norm": 43.14, "cmmlu-human_sexuality_5shot_acc_norm": 67.18, "cmmlu-moral_scenarios_5shot_acc_norm": 35.98, "cmmlu-world_religions_5shot_acc_norm": 66.08, "cmmlu-abstract_algebra_5shot_acc_norm": 34, "cmmlu-college_medicine_5shot_acc_norm": 66.47, "cmmlu-machine_learning_5shot_acc_norm": 46.43, "cmmlu-medical_genetics_5shot_acc_norm": 69, "cmmlu-professional_law_5shot_acc_norm": 43.42, "cmmlu-public_relations_5shot_acc_norm": 65.45, "cmmlu-security_studies_5shot_acc_norm": 73.47, "cmmlu-college_chemistry_5shot_acc_norm": 53, "cmmlu-computer_security_5shot_acc_norm": 72, "cmmlu-international_law_5shot_acc_norm": 81.82, "cmmlu-logical_fallacies_5shot_acc_norm": 64.42, "cmmlu-us_foreign_policy_5shot_acc_norm": 83, "cmmlu-clinical_knowledge_5shot_acc_norm": 69.43, "cmmlu-conceptual_physics_5shot_acc_norm": 58.72, "cmmlu-college_mathematics_5shot_acc_norm": 45, "cmmlu-high_school_biology_5shot_acc_norm": 72.26, "cmmlu-high_school_physics_5shot_acc_norm": 39.74, "cmmlu-high_school_chemistry_5shot_acc_norm": 51.72, "cmmlu-high_school_geography_5shot_acc_norm": 77.27, "cmmlu-professional_medicine_5shot_acc_norm": 58.46, "cmmlu-electrical_engineering_5shot_acc_norm": 62.76, "cmmlu-elementary_mathematics_5shot_acc_norm": 52.12, "cmmlu-high_school_psychology_5shot_acc_norm": 78.53, "cmmlu-high_school_statistics_5shot_acc_norm": 60.19, "cmmlu-high_school_us_history_5shot_acc_norm": 80.88, "cmmlu-high_school_mathematics_5shot_acc_norm": 42.96, "cmmlu-professional_accounting_5shot_acc_norm": 49.29, "cmmlu-professional_psychology_5shot_acc_norm": 62.58, "cmmlu-college_computer_science_5shot_acc_norm": 47, "cmmlu-high_school_world_history_5shot_acc_norm": 75.11, "cmmlu-high_school_macroeconomics_5shot_acc_norm": 65.64, "cmmlu-high_school_microeconomics_5shot_acc_norm": 72.27, "cmmlu-high_school_computer_science_5shot_acc_norm": 74, "cmmlu-high_school_european_history_5shot_acc_norm": 78.18, "cmmlu-high_school_government_and_politics_5shot_acc_norm": 77.2 } }
{}
{}
{}
{}
{ "model_name": "01-ai/Yi-9B", "model_dtype": "float16", "model_size": 0 }
{ "harness-c_arc_challenge": { "acc_norm": 52.82, "acc_stderr": 0, "c_arc_challenge_25shot_acc": 47.44, "c_arc_challenge_25shot_acc_norm": 52.82 }, "harness-c_gsm8k": { "acc": 38.82, "acc_stderr": 0, "c_gsm8k_5shot_acc": 38.82 }, "harness-c_hellaswag": { "acc_norm": 62.12, "acc_stderr": 0, "c_hellaswag_10shot_acc": 45.25, "c_hellaswag_10shot_acc_norm": 62.12 }, "harness-c-sem-v2": { "acc": 82.4175, "acc_stderr": 0, "c_sem_v2-LLSRC_5shot_acc": 86.91, "c_sem_v2-SLPWC_5shot_acc": 77.86, "c_sem_v2-SLRFC_5shot_acc": 80.58, "c_sem_v2-SLSRC_5shot_acc": 84.32, "c_sem_v2-LLSRC_5shot_acc_norm": 86.91, "c_sem_v2-SLPWC_5shot_acc_norm": 77.86, "c_sem_v2-SLRFC_5shot_acc_norm": 80.58, "c_sem_v2-SLSRC_5shot_acc_norm": 84.32 }, "harness-c_truthfulqa_mc": { "mc2": 46.26, "acc_stderr": 0, "c_truthfulqa_mc_0shot_mc1": 25.09, "c_truthfulqa_mc_0shot_mc2": 46.26 }, "harness-c_winogrande": { "acc": 64.25, "acc_stderr": 0, "c_winogrande_0shot_acc": 64.25 }, "harness-cmmlu": { "acc_norm": 61.66, "acc_stderr": 0, "cmmlu-anatomy_5shot_acc": 48.15, "cmmlu_fullavg_5shot_acc": 61.66, "cmmlu-virology_5shot_acc": 44.58, "cmmlu-astronomy_5shot_acc": 65.79, "cmmlu-marketing_5shot_acc": 83.76, "cmmlu-nutrition_5shot_acc": 68.63, "cmmlu-sociology_5shot_acc": 78.11, "cmmlu-management_5shot_acc": 79.61, "cmmlu-philosophy_5shot_acc": 67.2, "cmmlu-prehistory_5shot_acc": 61.73, "cmmlu-human_aging_5shot_acc": 63.23, "cmmlu-econometrics_5shot_acc": 44.74, "cmmlu-formal_logic_5shot_acc": 44.8, "cmmlu-global_facts_5shot_acc": 41, "cmmlu-jurisprudence_5shot_acc": 75, "cmmlu-miscellaneous_5shot_acc": 68.33, "cmmlu-moral_disputes_5shot_acc": 64.74, "cmmlu-business_ethics_5shot_acc": 70, "cmmlu-college_biology_5shot_acc": 65.28, "cmmlu-college_physics_5shot_acc": 48.04, "cmmlu-human_sexuality_5shot_acc": 67.18, "cmmlu-moral_scenarios_5shot_acc": 32.07, "cmmlu-world_religions_5shot_acc": 69.59, "cmmlu-abstract_algebra_5shot_acc": 26, "cmmlu-college_medicine_5shot_acc": 66.47, "cmmlu-machine_learning_5shot_acc": 46.43, "cmmlu-medical_genetics_5shot_acc": 64, "cmmlu-professional_law_5shot_acc": 42.44, "cmmlu-public_relations_5shot_acc": 63.64, "cmmlu-security_studies_5shot_acc": 71.43, "cmmlu-college_chemistry_5shot_acc": 43, "cmmlu-computer_security_5shot_acc": 69, "cmmlu-international_law_5shot_acc": 80.17, "cmmlu-logical_fallacies_5shot_acc": 63.8, "cmmlu-us_foreign_policy_5shot_acc": 82, "cmmlu-clinical_knowledge_5shot_acc": 65.66, "cmmlu-conceptual_physics_5shot_acc": 56.6, "cmmlu-college_mathematics_5shot_acc": 44, "cmmlu-high_school_biology_5shot_acc": 71.94, "cmmlu-high_school_physics_5shot_acc": 36.42, "cmmlu-high_school_chemistry_5shot_acc": 53.2, "cmmlu-high_school_geography_5shot_acc": 79.29, "cmmlu-professional_medicine_5shot_acc": 58.82, "cmmlu-electrical_engineering_5shot_acc": 64.14, "cmmlu-elementary_mathematics_5shot_acc": 52.38, "cmmlu-high_school_psychology_5shot_acc": 77.98, "cmmlu-high_school_statistics_5shot_acc": 58.33, "cmmlu-high_school_us_history_5shot_acc": 77.45, "cmmlu-high_school_mathematics_5shot_acc": 42.59, "cmmlu-professional_accounting_5shot_acc": 49.65, "cmmlu-professional_psychology_5shot_acc": 58.66, "cmmlu-college_computer_science_5shot_acc": 49, "cmmlu-high_school_world_history_5shot_acc": 77.22, "cmmlu-high_school_macroeconomics_5shot_acc": 70, "cmmlu-high_school_microeconomics_5shot_acc": 73.95, "cmmlu-high_school_computer_science_5shot_acc": 76, "cmmlu-high_school_european_history_5shot_acc": 73.33, "cmmlu-high_school_government_and_politics_5shot_acc": 78.24, "cmmlu-anatomy_5shot_acc_norm": 48.15, "cmmlu_fullavg_5shot_acc_norm": 61.66, "cmmlu-virology_5shot_acc_norm": 44.58, "cmmlu-astronomy_5shot_acc_norm": 65.79, "cmmlu-marketing_5shot_acc_norm": 83.76, "cmmlu-nutrition_5shot_acc_norm": 68.63, "cmmlu-sociology_5shot_acc_norm": 78.11, "cmmlu-management_5shot_acc_norm": 79.61, "cmmlu-philosophy_5shot_acc_norm": 67.2, "cmmlu-prehistory_5shot_acc_norm": 61.73, "cmmlu-human_aging_5shot_acc_norm": 63.23, "cmmlu-econometrics_5shot_acc_norm": 44.74, "cmmlu-formal_logic_5shot_acc_norm": 44.8, "cmmlu-global_facts_5shot_acc_norm": 41, "cmmlu-jurisprudence_5shot_acc_norm": 75, "cmmlu-miscellaneous_5shot_acc_norm": 68.33, "cmmlu-moral_disputes_5shot_acc_norm": 64.74, "cmmlu-business_ethics_5shot_acc_norm": 70, "cmmlu-college_biology_5shot_acc_norm": 65.28, "cmmlu-college_physics_5shot_acc_norm": 48.04, "cmmlu-human_sexuality_5shot_acc_norm": 67.18, "cmmlu-moral_scenarios_5shot_acc_norm": 32.07, "cmmlu-world_religions_5shot_acc_norm": 69.59, "cmmlu-abstract_algebra_5shot_acc_norm": 26, "cmmlu-college_medicine_5shot_acc_norm": 66.47, "cmmlu-machine_learning_5shot_acc_norm": 46.43, "cmmlu-medical_genetics_5shot_acc_norm": 64, "cmmlu-professional_law_5shot_acc_norm": 42.44, "cmmlu-public_relations_5shot_acc_norm": 63.64, "cmmlu-security_studies_5shot_acc_norm": 71.43, "cmmlu-college_chemistry_5shot_acc_norm": 43, "cmmlu-computer_security_5shot_acc_norm": 69, "cmmlu-international_law_5shot_acc_norm": 80.17, "cmmlu-logical_fallacies_5shot_acc_norm": 63.8, "cmmlu-us_foreign_policy_5shot_acc_norm": 82, "cmmlu-clinical_knowledge_5shot_acc_norm": 65.66, "cmmlu-conceptual_physics_5shot_acc_norm": 56.6, "cmmlu-college_mathematics_5shot_acc_norm": 44, "cmmlu-high_school_biology_5shot_acc_norm": 71.94, "cmmlu-high_school_physics_5shot_acc_norm": 36.42, "cmmlu-high_school_chemistry_5shot_acc_norm": 53.2, "cmmlu-high_school_geography_5shot_acc_norm": 79.29, "cmmlu-professional_medicine_5shot_acc_norm": 58.82, "cmmlu-electrical_engineering_5shot_acc_norm": 64.14, "cmmlu-elementary_mathematics_5shot_acc_norm": 52.38, "cmmlu-high_school_psychology_5shot_acc_norm": 77.98, "cmmlu-high_school_statistics_5shot_acc_norm": 58.33, "cmmlu-high_school_us_history_5shot_acc_norm": 77.45, "cmmlu-high_school_mathematics_5shot_acc_norm": 42.59, "cmmlu-professional_accounting_5shot_acc_norm": 49.65, "cmmlu-professional_psychology_5shot_acc_norm": 58.66, "cmmlu-college_computer_science_5shot_acc_norm": 49, "cmmlu-high_school_world_history_5shot_acc_norm": 77.22, "cmmlu-high_school_macroeconomics_5shot_acc_norm": 70, "cmmlu-high_school_microeconomics_5shot_acc_norm": 73.95, "cmmlu-high_school_computer_science_5shot_acc_norm": 76, "cmmlu-high_school_european_history_5shot_acc_norm": 73.33, "cmmlu-high_school_government_and_politics_5shot_acc_norm": 78.24 } }
{}
{}
{}
{}
{ "model_name": "01-ai/Yi-Coder-9B-Chat", "model_dtype": "float16", "model_size": 0 }
{ "harness-c_arc_challenge": { "acc_norm": 44.88, "acc_stderr": 0, "c_arc_challenge_25shot_acc": 40.1, "c_arc_challenge_25shot_acc_norm": 44.88 }, "harness-c_gsm8k": { "acc": 32.75, "acc_stderr": 0, "c_gsm8k_5shot_acc": 32.75 }, "harness-c_hellaswag": { "acc_norm": 54.4, "acc_stderr": 0, "c_hellaswag_10shot_acc": 40.38, "c_hellaswag_10shot_acc_norm": 54.4 }, "harness-c-sem-v2": { "acc": 63.6075, "acc_stderr": 0, "c_sem_v2-LLSRC_5shot_acc": 74.82, "c_sem_v2-SLPWC_5shot_acc": 53.71, "c_sem_v2-SLRFC_5shot_acc": 55.11, "c_sem_v2-SLSRC_5shot_acc": 70.79, "c_sem_v2-LLSRC_5shot_acc_norm": 74.82, "c_sem_v2-SLPWC_5shot_acc_norm": 53.71, "c_sem_v2-SLRFC_5shot_acc_norm": 55.11, "c_sem_v2-SLSRC_5shot_acc_norm": 70.79 }, "harness-c_truthfulqa_mc": { "mc2": 49.36, "acc_stderr": 0, "c_truthfulqa_mc_0shot_mc1": 27.17, "c_truthfulqa_mc_0shot_mc2": 49.36 }, "harness-c_winogrande": { "acc": 56.91, "acc_stderr": 0, "c_winogrande_0shot_acc": 56.91 }, "CLCC-H": { "acc": 0.4363, "acc_stderr": 0 }, "harness-cmmlu": { "acc_norm": 45.67, "acc_stderr": 0, "cmmlu_fullavg_5shot_acc": 45.67, "cmmlu-virology_5shot_acc": 39.76, "cmmlu-sociology_5shot_acc": 57.71, "cmmlu-world_religions_5shot_acc": 43.27, "cmmlu-professional_law_5shot_acc": 30.44, "cmmlu-public_relations_5shot_acc": 45.45, "cmmlu-security_studies_5shot_acc": 57.96, "cmmlu-us_foreign_policy_5shot_acc": 56, "cmmlu-professional_medicine_5shot_acc": 38.24, "cmmlu-professional_psychology_5shot_acc": 42.16, "cmmlu_fullavg_5shot_acc_norm": 45.67, "cmmlu-virology_5shot_acc_norm": 39.76, "cmmlu-sociology_5shot_acc_norm": 57.71, "cmmlu-world_religions_5shot_acc_norm": 43.27, "cmmlu-professional_law_5shot_acc_norm": 30.44, "cmmlu-public_relations_5shot_acc_norm": 45.45, "cmmlu-security_studies_5shot_acc_norm": 57.96, "cmmlu-us_foreign_policy_5shot_acc_norm": 56, "cmmlu-professional_medicine_5shot_acc_norm": 38.24, "cmmlu-professional_psychology_5shot_acc_norm": 42.16 } }
{}
{}
{}
{}
{ "model_name": "AIJUUD/juud-Mistral-7B-dpo", "model_dtype": "float16", "model_size": 0 }
{ "harness-c_arc_challenge": { "acc_norm": 45.9, "acc_stderr": 0, "c_arc_challenge_25shot_acc": 41.04, "c_arc_challenge_25shot_acc_norm": 45.9 }, "harness-c_gsm8k": { "acc": 14.33, "acc_stderr": 0, "c_gsm8k_5shot_acc": 14.33 }, "harness-c_hellaswag": { "acc_norm": 54.84, "acc_stderr": 0, "c_hellaswag_10shot_acc": 41.48, "c_hellaswag_10shot_acc_norm": 54.84 }, "harness-c-sem-v2": { "acc": 59.2025, "acc_stderr": 0, "c_sem_v2-LLSRC_5shot_acc": 56.98, "c_sem_v2-SLPWC_5shot_acc": 63.43, "c_sem_v2-SLRFC_5shot_acc": 47.05, "c_sem_v2-SLSRC_5shot_acc": 69.35, "c_sem_v2-LLSRC_5shot_acc_norm": 56.98, "c_sem_v2-SLPWC_5shot_acc_norm": 63.43, "c_sem_v2-SLRFC_5shot_acc_norm": 47.05, "c_sem_v2-SLSRC_5shot_acc_norm": 69.35 }, "harness-c_truthfulqa_mc": { "mc2": 53.91, "acc_stderr": 0, "c_truthfulqa_mc_0shot_mc1": 31.58, "c_truthfulqa_mc_0shot_mc2": 53.91 }, "harness-c_winogrande": { "acc": 60.38, "acc_stderr": 0, "c_winogrande_0shot_acc": 60.38 }, "CLCC-H": { "acc": 0.5764, "acc_stderr": 0 }, "harness-cmmlu": { "acc_norm": 47.03, "acc_stderr": 0, "cmmlu-anatomy_5shot_acc": 36.3, "cmmlu_fullavg_5shot_acc": 47.03, "cmmlu-virology_5shot_acc": 37.35, "cmmlu-astronomy_5shot_acc": 47.37, "cmmlu-marketing_5shot_acc": 71.37, "cmmlu-nutrition_5shot_acc": 48.37, "cmmlu-sociology_5shot_acc": 66.17, "cmmlu-management_5shot_acc": 56.31, "cmmlu-philosophy_5shot_acc": 46.3, "cmmlu-prehistory_5shot_acc": 47.22, "cmmlu-human_aging_5shot_acc": 52.02, "cmmlu-econometrics_5shot_acc": 38.6, "cmmlu-formal_logic_5shot_acc": 38.4, "cmmlu-global_facts_5shot_acc": 30, "cmmlu-jurisprudence_5shot_acc": 54.63, "cmmlu-miscellaneous_5shot_acc": 50.83, "cmmlu-moral_disputes_5shot_acc": 50.87, "cmmlu-business_ethics_5shot_acc": 55, "cmmlu-college_biology_5shot_acc": 38.19, "cmmlu-college_physics_5shot_acc": 39.22, "cmmlu-human_sexuality_5shot_acc": 45.8, "cmmlu-moral_scenarios_5shot_acc": 22.57, "cmmlu-world_religions_5shot_acc": 43.86, "cmmlu-abstract_algebra_5shot_acc": 29, "cmmlu-college_medicine_5shot_acc": 39.88, "cmmlu-machine_learning_5shot_acc": 38.39, "cmmlu-medical_genetics_5shot_acc": 40, "cmmlu-professional_law_5shot_acc": 36.38, "cmmlu-public_relations_5shot_acc": 55.45, "cmmlu-security_studies_5shot_acc": 62.04, "cmmlu-college_chemistry_5shot_acc": 41, "cmmlu-computer_security_5shot_acc": 57, "cmmlu-international_law_5shot_acc": 70.25, "cmmlu-logical_fallacies_5shot_acc": 49.08, "cmmlu-us_foreign_policy_5shot_acc": 53, "cmmlu-clinical_knowledge_5shot_acc": 50.19, "cmmlu-conceptual_physics_5shot_acc": 48.51, "cmmlu-college_mathematics_5shot_acc": 38, "cmmlu-high_school_biology_5shot_acc": 50.32, "cmmlu-high_school_physics_5shot_acc": 33.11, "cmmlu-high_school_chemistry_5shot_acc": 40.39, "cmmlu-high_school_geography_5shot_acc": 60.61, "cmmlu-professional_medicine_5shot_acc": 35.66, "cmmlu-electrical_engineering_5shot_acc": 51.72, "cmmlu-elementary_mathematics_5shot_acc": 34.92, "cmmlu-high_school_psychology_5shot_acc": 54.68, "cmmlu-high_school_statistics_5shot_acc": 38.43, "cmmlu-high_school_us_history_5shot_acc": 54.41, "cmmlu-high_school_mathematics_5shot_acc": 32.96, "cmmlu-professional_accounting_5shot_acc": 36.17, "cmmlu-professional_psychology_5shot_acc": 42.81, "cmmlu-college_computer_science_5shot_acc": 43, "cmmlu-high_school_world_history_5shot_acc": 66.67, "cmmlu-high_school_macroeconomics_5shot_acc": 51.03, "cmmlu-high_school_microeconomics_5shot_acc": 44.12, "cmmlu-high_school_computer_science_5shot_acc": 67, "cmmlu-high_school_european_history_5shot_acc": 55.76, "cmmlu-high_school_government_and_politics_5shot_acc": 62.18, "cmmlu-anatomy_5shot_acc_norm": 36.3, "cmmlu_fullavg_5shot_acc_norm": 47.03, "cmmlu-virology_5shot_acc_norm": 37.35, "cmmlu-astronomy_5shot_acc_norm": 47.37, "cmmlu-marketing_5shot_acc_norm": 71.37, "cmmlu-nutrition_5shot_acc_norm": 48.37, "cmmlu-sociology_5shot_acc_norm": 66.17, "cmmlu-management_5shot_acc_norm": 56.31, "cmmlu-philosophy_5shot_acc_norm": 46.3, "cmmlu-prehistory_5shot_acc_norm": 47.22, "cmmlu-human_aging_5shot_acc_norm": 52.02, "cmmlu-econometrics_5shot_acc_norm": 38.6, "cmmlu-formal_logic_5shot_acc_norm": 38.4, "cmmlu-global_facts_5shot_acc_norm": 30, "cmmlu-jurisprudence_5shot_acc_norm": 54.63, "cmmlu-miscellaneous_5shot_acc_norm": 50.83, "cmmlu-moral_disputes_5shot_acc_norm": 50.87, "cmmlu-business_ethics_5shot_acc_norm": 55, "cmmlu-college_biology_5shot_acc_norm": 38.19, "cmmlu-college_physics_5shot_acc_norm": 39.22, "cmmlu-human_sexuality_5shot_acc_norm": 45.8, "cmmlu-moral_scenarios_5shot_acc_norm": 22.57, "cmmlu-world_religions_5shot_acc_norm": 43.86, "cmmlu-abstract_algebra_5shot_acc_norm": 29, "cmmlu-college_medicine_5shot_acc_norm": 39.88, "cmmlu-machine_learning_5shot_acc_norm": 38.39, "cmmlu-medical_genetics_5shot_acc_norm": 40, "cmmlu-professional_law_5shot_acc_norm": 36.38, "cmmlu-public_relations_5shot_acc_norm": 55.45, "cmmlu-security_studies_5shot_acc_norm": 62.04, "cmmlu-college_chemistry_5shot_acc_norm": 41, "cmmlu-computer_security_5shot_acc_norm": 57, "cmmlu-international_law_5shot_acc_norm": 70.25, "cmmlu-logical_fallacies_5shot_acc_norm": 49.08, "cmmlu-us_foreign_policy_5shot_acc_norm": 53, "cmmlu-clinical_knowledge_5shot_acc_norm": 50.19, "cmmlu-conceptual_physics_5shot_acc_norm": 48.51, "cmmlu-college_mathematics_5shot_acc_norm": 38, "cmmlu-high_school_biology_5shot_acc_norm": 50.32, "cmmlu-high_school_physics_5shot_acc_norm": 33.11, "cmmlu-high_school_chemistry_5shot_acc_norm": 40.39, "cmmlu-high_school_geography_5shot_acc_norm": 60.61, "cmmlu-professional_medicine_5shot_acc_norm": 35.66, "cmmlu-electrical_engineering_5shot_acc_norm": 51.72, "cmmlu-elementary_mathematics_5shot_acc_norm": 34.92, "cmmlu-high_school_psychology_5shot_acc_norm": 54.68, "cmmlu-high_school_statistics_5shot_acc_norm": 38.43, "cmmlu-high_school_us_history_5shot_acc_norm": 54.41, "cmmlu-high_school_mathematics_5shot_acc_norm": 32.96, "cmmlu-professional_accounting_5shot_acc_norm": 36.17, "cmmlu-professional_psychology_5shot_acc_norm": 42.81, "cmmlu-college_computer_science_5shot_acc_norm": 43, "cmmlu-high_school_world_history_5shot_acc_norm": 66.67, "cmmlu-high_school_macroeconomics_5shot_acc_norm": 51.03, "cmmlu-high_school_microeconomics_5shot_acc_norm": 44.12, "cmmlu-high_school_computer_science_5shot_acc_norm": 67, "cmmlu-high_school_european_history_5shot_acc_norm": 55.76, "cmmlu-high_school_government_and_politics_5shot_acc_norm": 62.18 } }
{}
{}
{}
{}
{ "model_name": "Artples/L-MChat-7b", "model_dtype": "float16", "model_size": 0 }
{ "harness-c_arc_challenge": { "acc_norm": 47.35, "acc_stderr": 0, "c_arc_challenge_25shot_acc": 41.21, "c_arc_challenge_25shot_acc_norm": 47.35 }, "harness-c_gsm8k": { "acc": 42.08, "acc_stderr": 0, "c_gsm8k_5shot_acc": 42.08 }, "harness-c_hellaswag": { "acc_norm": 54.08, "acc_stderr": 0, "c_hellaswag_10shot_acc": 41.13, "c_hellaswag_10shot_acc_norm": 54.08 }, "harness-c-sem-v2": { "acc": 57.875, "acc_stderr": 0, "c_sem_v2-LLSRC_5shot_acc": 54.1, "c_sem_v2-SLPWC_5shot_acc": 61.14, "c_sem_v2-SLRFC_5shot_acc": 46.33, "c_sem_v2-SLSRC_5shot_acc": 69.93, "c_sem_v2-LLSRC_5shot_acc_norm": 54.1, "c_sem_v2-SLPWC_5shot_acc_norm": 61.14, "c_sem_v2-SLRFC_5shot_acc_norm": 46.33, "c_sem_v2-SLSRC_5shot_acc_norm": 69.93 }, "harness-c_truthfulqa_mc": { "mc2": 53.26, "acc_stderr": 0, "c_truthfulqa_mc_0shot_mc1": 30.72, "c_truthfulqa_mc_0shot_mc2": 53.26 }, "harness-c_winogrande": { "acc": 60.54, "acc_stderr": 0, "c_winogrande_0shot_acc": 60.54 }, "CLCC-H": { "acc": 0.6354, "acc_stderr": 0 }, "harness-cmmlu": { "acc_norm": 47.7, "acc_stderr": 0, "cmmlu-anatomy_5shot_acc": 40, "cmmlu_fullavg_5shot_acc": 47.7, "cmmlu-virology_5shot_acc": 40.36, "cmmlu-astronomy_5shot_acc": 51.32, "cmmlu-marketing_5shot_acc": 69.66, "cmmlu-nutrition_5shot_acc": 50, "cmmlu-sociology_5shot_acc": 64.68, "cmmlu-management_5shot_acc": 61.17, "cmmlu-philosophy_5shot_acc": 50.48, "cmmlu-prehistory_5shot_acc": 41.36, "cmmlu-human_aging_5shot_acc": 46.19, "cmmlu-econometrics_5shot_acc": 35.96, "cmmlu-formal_logic_5shot_acc": 36, "cmmlu-global_facts_5shot_acc": 33, "cmmlu-jurisprudence_5shot_acc": 62.96, "cmmlu-miscellaneous_5shot_acc": 47.64, "cmmlu-moral_disputes_5shot_acc": 56.07, "cmmlu-business_ethics_5shot_acc": 51, "cmmlu-college_biology_5shot_acc": 38.89, "cmmlu-college_physics_5shot_acc": 29.41, "cmmlu-human_sexuality_5shot_acc": 49.62, "cmmlu-moral_scenarios_5shot_acc": 21.34, "cmmlu-world_religions_5shot_acc": 43.27, "cmmlu-abstract_algebra_5shot_acc": 28, "cmmlu-college_medicine_5shot_acc": 43.35, "cmmlu-machine_learning_5shot_acc": 45.54, "cmmlu-medical_genetics_5shot_acc": 46, "cmmlu-professional_law_5shot_acc": 36.18, "cmmlu-public_relations_5shot_acc": 50.91, "cmmlu-security_studies_5shot_acc": 62.45, "cmmlu-college_chemistry_5shot_acc": 43, "cmmlu-computer_security_5shot_acc": 60, "cmmlu-international_law_5shot_acc": 68.6, "cmmlu-logical_fallacies_5shot_acc": 48.47, "cmmlu-us_foreign_policy_5shot_acc": 59, "cmmlu-clinical_knowledge_5shot_acc": 49.81, "cmmlu-conceptual_physics_5shot_acc": 42.13, "cmmlu-college_mathematics_5shot_acc": 30, "cmmlu-high_school_biology_5shot_acc": 50.97, "cmmlu-high_school_physics_5shot_acc": 35.1, "cmmlu-high_school_chemistry_5shot_acc": 37.93, "cmmlu-high_school_geography_5shot_acc": 55.56, "cmmlu-professional_medicine_5shot_acc": 43.01, "cmmlu-electrical_engineering_5shot_acc": 47.59, "cmmlu-elementary_mathematics_5shot_acc": 35.71, "cmmlu-high_school_psychology_5shot_acc": 55.41, "cmmlu-high_school_statistics_5shot_acc": 50, "cmmlu-high_school_us_history_5shot_acc": 53.92, "cmmlu-high_school_mathematics_5shot_acc": 33.33, "cmmlu-professional_accounting_5shot_acc": 36.52, "cmmlu-professional_psychology_5shot_acc": 46.57, "cmmlu-college_computer_science_5shot_acc": 48, "cmmlu-high_school_world_history_5shot_acc": 63.29, "cmmlu-high_school_macroeconomics_5shot_acc": 50.26, "cmmlu-high_school_microeconomics_5shot_acc": 53.36, "cmmlu-high_school_computer_science_5shot_acc": 65, "cmmlu-high_school_european_history_5shot_acc": 64.24, "cmmlu-high_school_government_and_politics_5shot_acc": 59.07, "cmmlu-anatomy_5shot_acc_norm": 40, "cmmlu_fullavg_5shot_acc_norm": 47.7, "cmmlu-virology_5shot_acc_norm": 40.36, "cmmlu-astronomy_5shot_acc_norm": 51.32, "cmmlu-marketing_5shot_acc_norm": 69.66, "cmmlu-nutrition_5shot_acc_norm": 50, "cmmlu-sociology_5shot_acc_norm": 64.68, "cmmlu-management_5shot_acc_norm": 61.17, "cmmlu-philosophy_5shot_acc_norm": 50.48, "cmmlu-prehistory_5shot_acc_norm": 41.36, "cmmlu-human_aging_5shot_acc_norm": 46.19, "cmmlu-econometrics_5shot_acc_norm": 35.96, "cmmlu-formal_logic_5shot_acc_norm": 36, "cmmlu-global_facts_5shot_acc_norm": 33, "cmmlu-jurisprudence_5shot_acc_norm": 62.96, "cmmlu-miscellaneous_5shot_acc_norm": 47.64, "cmmlu-moral_disputes_5shot_acc_norm": 56.07, "cmmlu-business_ethics_5shot_acc_norm": 51, "cmmlu-college_biology_5shot_acc_norm": 38.89, "cmmlu-college_physics_5shot_acc_norm": 29.41, "cmmlu-human_sexuality_5shot_acc_norm": 49.62, "cmmlu-moral_scenarios_5shot_acc_norm": 21.34, "cmmlu-world_religions_5shot_acc_norm": 43.27, "cmmlu-abstract_algebra_5shot_acc_norm": 28, "cmmlu-college_medicine_5shot_acc_norm": 43.35, "cmmlu-machine_learning_5shot_acc_norm": 45.54, "cmmlu-medical_genetics_5shot_acc_norm": 46, "cmmlu-professional_law_5shot_acc_norm": 36.18, "cmmlu-public_relations_5shot_acc_norm": 50.91, "cmmlu-security_studies_5shot_acc_norm": 62.45, "cmmlu-college_chemistry_5shot_acc_norm": 43, "cmmlu-computer_security_5shot_acc_norm": 60, "cmmlu-international_law_5shot_acc_norm": 68.6, "cmmlu-logical_fallacies_5shot_acc_norm": 48.47, "cmmlu-us_foreign_policy_5shot_acc_norm": 59, "cmmlu-clinical_knowledge_5shot_acc_norm": 49.81, "cmmlu-conceptual_physics_5shot_acc_norm": 42.13, "cmmlu-college_mathematics_5shot_acc_norm": 30, "cmmlu-high_school_biology_5shot_acc_norm": 50.97, "cmmlu-high_school_physics_5shot_acc_norm": 35.1, "cmmlu-high_school_chemistry_5shot_acc_norm": 37.93, "cmmlu-high_school_geography_5shot_acc_norm": 55.56, "cmmlu-professional_medicine_5shot_acc_norm": 43.01, "cmmlu-electrical_engineering_5shot_acc_norm": 47.59, "cmmlu-elementary_mathematics_5shot_acc_norm": 35.71, "cmmlu-high_school_psychology_5shot_acc_norm": 55.41, "cmmlu-high_school_statistics_5shot_acc_norm": 50, "cmmlu-high_school_us_history_5shot_acc_norm": 53.92, "cmmlu-high_school_mathematics_5shot_acc_norm": 33.33, "cmmlu-professional_accounting_5shot_acc_norm": 36.52, "cmmlu-professional_psychology_5shot_acc_norm": 46.57, "cmmlu-college_computer_science_5shot_acc_norm": 48, "cmmlu-high_school_world_history_5shot_acc_norm": 63.29, "cmmlu-high_school_macroeconomics_5shot_acc_norm": 50.26, "cmmlu-high_school_microeconomics_5shot_acc_norm": 53.36, "cmmlu-high_school_computer_science_5shot_acc_norm": 65, "cmmlu-high_school_european_history_5shot_acc_norm": 64.24, "cmmlu-high_school_government_and_politics_5shot_acc_norm": 59.07 } }
{}
{}
{}
{}
{ "model_name": "Artples/L-MChat-Small", "model_dtype": "float16", "model_size": 0 }
{ "harness-c_arc_challenge": { "acc_norm": 28.24, "acc_stderr": 0, "c_arc_challenge_25shot_acc": 25, "c_arc_challenge_25shot_acc_norm": 28.24 }, "harness-c_gsm8k": { "acc": 7.66, "acc_stderr": 0, "c_gsm8k_5shot_acc": 7.66 }, "harness-c_hellaswag": { "acc_norm": 31.67, "acc_stderr": 0, "c_hellaswag_10shot_acc": 29.2, "c_hellaswag_10shot_acc_norm": 31.67 }, "harness-c-sem-v2": { "acc": 33.875, "acc_stderr": 0, "c_sem_v2-LLSRC_5shot_acc": 36.4, "c_sem_v2-SLPWC_5shot_acc": 24.14, "c_sem_v2-SLRFC_5shot_acc": 34.24, "c_sem_v2-SLSRC_5shot_acc": 40.72, "c_sem_v2-LLSRC_5shot_acc_norm": 36.4, "c_sem_v2-SLPWC_5shot_acc_norm": 24.14, "c_sem_v2-SLRFC_5shot_acc_norm": 34.24, "c_sem_v2-SLSRC_5shot_acc_norm": 40.72 }, "harness-c_truthfulqa_mc": { "mc2": 49.57, "acc_stderr": 0, "c_truthfulqa_mc_0shot_mc1": 26.81, "c_truthfulqa_mc_0shot_mc2": 49.57 }, "harness-c_winogrande": { "acc": 51.7, "acc_stderr": 0, "c_winogrande_0shot_acc": 51.7 }, "CLCC-H": { "acc": 0.2373, "acc_stderr": 0 }, "harness-cmmlu": { "acc_norm": 33.06, "acc_stderr": 0, "cmmlu-anatomy_5shot_acc": 32.59, "cmmlu_fullavg_5shot_acc": 33.06, "cmmlu-virology_5shot_acc": 33.13, "cmmlu-astronomy_5shot_acc": 30.26, "cmmlu-marketing_5shot_acc": 45.73, "cmmlu-nutrition_5shot_acc": 31.05, "cmmlu-sociology_5shot_acc": 42.79, "cmmlu-management_5shot_acc": 26.21, "cmmlu-philosophy_5shot_acc": 32.48, "cmmlu-prehistory_5shot_acc": 34.88, "cmmlu-human_aging_5shot_acc": 35.43, "cmmlu-econometrics_5shot_acc": 29.82, "cmmlu-formal_logic_5shot_acc": 31.2, "cmmlu-global_facts_5shot_acc": 34, "cmmlu-jurisprudence_5shot_acc": 33.33, "cmmlu-miscellaneous_5shot_acc": 32.31, "cmmlu-moral_disputes_5shot_acc": 33.24, "cmmlu-business_ethics_5shot_acc": 27, "cmmlu-college_biology_5shot_acc": 29.86, "cmmlu-college_physics_5shot_acc": 18.63, "cmmlu-human_sexuality_5shot_acc": 43.51, "cmmlu-moral_scenarios_5shot_acc": 23.58, "cmmlu-world_religions_5shot_acc": 34.5, "cmmlu-abstract_algebra_5shot_acc": 28, "cmmlu-college_medicine_5shot_acc": 28.32, "cmmlu-machine_learning_5shot_acc": 26.79, "cmmlu-medical_genetics_5shot_acc": 42, "cmmlu-professional_law_5shot_acc": 28.42, "cmmlu-public_relations_5shot_acc": 36.36, "cmmlu-security_studies_5shot_acc": 31.43, "cmmlu-college_chemistry_5shot_acc": 21, "cmmlu-computer_security_5shot_acc": 43, "cmmlu-international_law_5shot_acc": 52.89, "cmmlu-logical_fallacies_5shot_acc": 33.74, "cmmlu-us_foreign_policy_5shot_acc": 52, "cmmlu-clinical_knowledge_5shot_acc": 35.09, "cmmlu-conceptual_physics_5shot_acc": 34.89, "cmmlu-college_mathematics_5shot_acc": 30, "cmmlu-high_school_biology_5shot_acc": 38.06, "cmmlu-high_school_physics_5shot_acc": 29.14, "cmmlu-high_school_chemistry_5shot_acc": 34.48, "cmmlu-high_school_geography_5shot_acc": 30.81, "cmmlu-professional_medicine_5shot_acc": 19.49, "cmmlu-electrical_engineering_5shot_acc": 42.76, "cmmlu-elementary_mathematics_5shot_acc": 25.13, "cmmlu-high_school_psychology_5shot_acc": 32.11, "cmmlu-high_school_statistics_5shot_acc": 29.63, "cmmlu-high_school_us_history_5shot_acc": 30.88, "cmmlu-high_school_mathematics_5shot_acc": 24.81, "cmmlu-professional_accounting_5shot_acc": 29.79, "cmmlu-professional_psychology_5shot_acc": 29.58, "cmmlu-college_computer_science_5shot_acc": 35, "cmmlu-high_school_world_history_5shot_acc": 36.71, "cmmlu-high_school_macroeconomics_5shot_acc": 33.85, "cmmlu-high_school_microeconomics_5shot_acc": 34.45, "cmmlu-high_school_computer_science_5shot_acc": 40, "cmmlu-high_school_european_history_5shot_acc": 32.12, "cmmlu-high_school_government_and_politics_5shot_acc": 36.27, "cmmlu-anatomy_5shot_acc_norm": 32.59, "cmmlu_fullavg_5shot_acc_norm": 33.06, "cmmlu-virology_5shot_acc_norm": 33.13, "cmmlu-astronomy_5shot_acc_norm": 30.26, "cmmlu-marketing_5shot_acc_norm": 45.73, "cmmlu-nutrition_5shot_acc_norm": 31.05, "cmmlu-sociology_5shot_acc_norm": 42.79, "cmmlu-management_5shot_acc_norm": 26.21, "cmmlu-philosophy_5shot_acc_norm": 32.48, "cmmlu-prehistory_5shot_acc_norm": 34.88, "cmmlu-human_aging_5shot_acc_norm": 35.43, "cmmlu-econometrics_5shot_acc_norm": 29.82, "cmmlu-formal_logic_5shot_acc_norm": 31.2, "cmmlu-global_facts_5shot_acc_norm": 34, "cmmlu-jurisprudence_5shot_acc_norm": 33.33, "cmmlu-miscellaneous_5shot_acc_norm": 32.31, "cmmlu-moral_disputes_5shot_acc_norm": 33.24, "cmmlu-business_ethics_5shot_acc_norm": 27, "cmmlu-college_biology_5shot_acc_norm": 29.86, "cmmlu-college_physics_5shot_acc_norm": 18.63, "cmmlu-human_sexuality_5shot_acc_norm": 43.51, "cmmlu-moral_scenarios_5shot_acc_norm": 23.58, "cmmlu-world_religions_5shot_acc_norm": 34.5, "cmmlu-abstract_algebra_5shot_acc_norm": 28, "cmmlu-college_medicine_5shot_acc_norm": 28.32, "cmmlu-machine_learning_5shot_acc_norm": 26.79, "cmmlu-medical_genetics_5shot_acc_norm": 42, "cmmlu-professional_law_5shot_acc_norm": 28.42, "cmmlu-public_relations_5shot_acc_norm": 36.36, "cmmlu-security_studies_5shot_acc_norm": 31.43, "cmmlu-college_chemistry_5shot_acc_norm": 21, "cmmlu-computer_security_5shot_acc_norm": 43, "cmmlu-international_law_5shot_acc_norm": 52.89, "cmmlu-logical_fallacies_5shot_acc_norm": 33.74, "cmmlu-us_foreign_policy_5shot_acc_norm": 52, "cmmlu-clinical_knowledge_5shot_acc_norm": 35.09, "cmmlu-conceptual_physics_5shot_acc_norm": 34.89, "cmmlu-college_mathematics_5shot_acc_norm": 30, "cmmlu-high_school_biology_5shot_acc_norm": 38.06, "cmmlu-high_school_physics_5shot_acc_norm": 29.14, "cmmlu-high_school_chemistry_5shot_acc_norm": 34.48, "cmmlu-high_school_geography_5shot_acc_norm": 30.81, "cmmlu-professional_medicine_5shot_acc_norm": 19.49, "cmmlu-electrical_engineering_5shot_acc_norm": 42.76, "cmmlu-elementary_mathematics_5shot_acc_norm": 25.13, "cmmlu-high_school_psychology_5shot_acc_norm": 32.11, "cmmlu-high_school_statistics_5shot_acc_norm": 29.63, "cmmlu-high_school_us_history_5shot_acc_norm": 30.88, "cmmlu-high_school_mathematics_5shot_acc_norm": 24.81, "cmmlu-professional_accounting_5shot_acc_norm": 29.79, "cmmlu-professional_psychology_5shot_acc_norm": 29.58, "cmmlu-college_computer_science_5shot_acc_norm": 35, "cmmlu-high_school_world_history_5shot_acc_norm": 36.71, "cmmlu-high_school_macroeconomics_5shot_acc_norm": 33.85, "cmmlu-high_school_microeconomics_5shot_acc_norm": 34.45, "cmmlu-high_school_computer_science_5shot_acc_norm": 40, "cmmlu-high_school_european_history_5shot_acc_norm": 32.12, "cmmlu-high_school_government_and_politics_5shot_acc_norm": 36.27 } }
{}
{}
{}
{}
{ "model_name": "Azure99/blossom-v5.1-34b", "model_dtype": "float16", "model_size": 0 }
{ "harness-c_arc_challenge": { "acc_norm": 61.18, "acc_stderr": 0, "c_arc_challenge_25shot_acc": 54.52, "c_arc_challenge_25shot_acc_norm": 61.18 }, "harness-c_gsm8k": { "acc": 67.85, "acc_stderr": 0, "c_gsm8k_5shot_acc": 67.85 }, "harness-c_hellaswag": { "acc_norm": 68.61, "acc_stderr": 0, "c_hellaswag_10shot_acc": 49.84, "c_hellaswag_10shot_acc_norm": 68.61 }, "harness-c-sem-v2": { "acc": 89.735, "acc_stderr": 0, "c_sem_v2-LLSRC_5shot_acc": 93.38, "c_sem_v2-SLPWC_5shot_acc": 85.71, "c_sem_v2-SLRFC_5shot_acc": 92.66, "c_sem_v2-SLSRC_5shot_acc": 87.19, "c_sem_v2-LLSRC_5shot_acc_norm": 93.38, "c_sem_v2-SLPWC_5shot_acc_norm": 85.71, "c_sem_v2-SLRFC_5shot_acc_norm": 92.66, "c_sem_v2-SLSRC_5shot_acc_norm": 87.19 }, "harness-c_truthfulqa_mc": { "mc2": 54.47, "acc_stderr": 0, "c_truthfulqa_mc_0shot_mc1": 31.82, "c_truthfulqa_mc_0shot_mc2": 54.47 }, "harness-c_winogrande": { "acc": 68.51, "acc_stderr": 0, "c_winogrande_0shot_acc": 68.51 }, "CLCC-H": { "acc": 0, "acc_stderr": 0 }, "harness-cmmlu": { "acc_norm": 70.37, "acc_stderr": 0, "cmmlu-anatomy_5shot_acc": 60, "cmmlu_fullavg_5shot_acc": 70.37, "cmmlu-virology_5shot_acc": 52.41, "cmmlu-astronomy_5shot_acc": 78.95, "cmmlu-marketing_5shot_acc": 87.61, "cmmlu-nutrition_5shot_acc": 77.78, "cmmlu-sociology_5shot_acc": 83.08, "cmmlu-management_5shot_acc": 78.64, "cmmlu-philosophy_5shot_acc": 71.38, "cmmlu-prehistory_5shot_acc": 71.6, "cmmlu-human_aging_5shot_acc": 70.4, "cmmlu-econometrics_5shot_acc": 62.28, "cmmlu-formal_logic_5shot_acc": 53.6, "cmmlu-global_facts_5shot_acc": 51, "cmmlu-jurisprudence_5shot_acc": 80.56, "cmmlu-miscellaneous_5shot_acc": 78.67, "cmmlu-moral_disputes_5shot_acc": 73.99, "cmmlu-business_ethics_5shot_acc": 73, "cmmlu-college_biology_5shot_acc": 69.44, "cmmlu-college_physics_5shot_acc": 51.96, "cmmlu-human_sexuality_5shot_acc": 74.05, "cmmlu-moral_scenarios_5shot_acc": 57.32, "cmmlu-world_religions_5shot_acc": 72.51, "cmmlu-abstract_algebra_5shot_acc": 43, "cmmlu-college_medicine_5shot_acc": 69.94, "cmmlu-machine_learning_5shot_acc": 58.93, "cmmlu-medical_genetics_5shot_acc": 68, "cmmlu-professional_law_5shot_acc": 51.3, "cmmlu-public_relations_5shot_acc": 67.27, "cmmlu-security_studies_5shot_acc": 76.73, "cmmlu-college_chemistry_5shot_acc": 48, "cmmlu-computer_security_5shot_acc": 78, "cmmlu-international_law_5shot_acc": 89.26, "cmmlu-logical_fallacies_5shot_acc": 73.01, "cmmlu-us_foreign_policy_5shot_acc": 85, "cmmlu-clinical_knowledge_5shot_acc": 74.34, "cmmlu-conceptual_physics_5shot_acc": 73.19, "cmmlu-college_mathematics_5shot_acc": 42, "cmmlu-high_school_biology_5shot_acc": 83.55, "cmmlu-high_school_physics_5shot_acc": 58.28, "cmmlu-high_school_chemistry_5shot_acc": 60.59, "cmmlu-high_school_geography_5shot_acc": 82.83, "cmmlu-professional_medicine_5shot_acc": 73.16, "cmmlu-electrical_engineering_5shot_acc": 71.72, "cmmlu-elementary_mathematics_5shot_acc": 69.05, "cmmlu-high_school_psychology_5shot_acc": 84.04, "cmmlu-high_school_statistics_5shot_acc": 68.06, "cmmlu-high_school_us_history_5shot_acc": 90.2, "cmmlu-high_school_mathematics_5shot_acc": 49.26, "cmmlu-professional_accounting_5shot_acc": 54.61, "cmmlu-professional_psychology_5shot_acc": 69.93, "cmmlu-college_computer_science_5shot_acc": 63, "cmmlu-high_school_world_history_5shot_acc": 84.81, "cmmlu-high_school_macroeconomics_5shot_acc": 77.95, "cmmlu-high_school_microeconomics_5shot_acc": 87.39, "cmmlu-high_school_computer_science_5shot_acc": 83, "cmmlu-high_school_european_history_5shot_acc": 81.21, "cmmlu-high_school_government_and_politics_5shot_acc": 90.16, "cmmlu-anatomy_5shot_acc_norm": 60, "cmmlu_fullavg_5shot_acc_norm": 70.37, "cmmlu-virology_5shot_acc_norm": 52.41, "cmmlu-astronomy_5shot_acc_norm": 78.95, "cmmlu-marketing_5shot_acc_norm": 87.61, "cmmlu-nutrition_5shot_acc_norm": 77.78, "cmmlu-sociology_5shot_acc_norm": 83.08, "cmmlu-management_5shot_acc_norm": 78.64, "cmmlu-philosophy_5shot_acc_norm": 71.38, "cmmlu-prehistory_5shot_acc_norm": 71.6, "cmmlu-human_aging_5shot_acc_norm": 70.4, "cmmlu-econometrics_5shot_acc_norm": 62.28, "cmmlu-formal_logic_5shot_acc_norm": 53.6, "cmmlu-global_facts_5shot_acc_norm": 51, "cmmlu-jurisprudence_5shot_acc_norm": 80.56, "cmmlu-miscellaneous_5shot_acc_norm": 78.67, "cmmlu-moral_disputes_5shot_acc_norm": 73.99, "cmmlu-business_ethics_5shot_acc_norm": 73, "cmmlu-college_biology_5shot_acc_norm": 69.44, "cmmlu-college_physics_5shot_acc_norm": 51.96, "cmmlu-human_sexuality_5shot_acc_norm": 74.05, "cmmlu-moral_scenarios_5shot_acc_norm": 57.32, "cmmlu-world_religions_5shot_acc_norm": 72.51, "cmmlu-abstract_algebra_5shot_acc_norm": 43, "cmmlu-college_medicine_5shot_acc_norm": 69.94, "cmmlu-machine_learning_5shot_acc_norm": 58.93, "cmmlu-medical_genetics_5shot_acc_norm": 68, "cmmlu-professional_law_5shot_acc_norm": 51.3, "cmmlu-public_relations_5shot_acc_norm": 67.27, "cmmlu-security_studies_5shot_acc_norm": 76.73, "cmmlu-college_chemistry_5shot_acc_norm": 48, "cmmlu-computer_security_5shot_acc_norm": 78, "cmmlu-international_law_5shot_acc_norm": 89.26, "cmmlu-logical_fallacies_5shot_acc_norm": 73.01, "cmmlu-us_foreign_policy_5shot_acc_norm": 85, "cmmlu-clinical_knowledge_5shot_acc_norm": 74.34, "cmmlu-conceptual_physics_5shot_acc_norm": 73.19, "cmmlu-college_mathematics_5shot_acc_norm": 42, "cmmlu-high_school_biology_5shot_acc_norm": 83.55, "cmmlu-high_school_physics_5shot_acc_norm": 58.28, "cmmlu-high_school_chemistry_5shot_acc_norm": 60.59, "cmmlu-high_school_geography_5shot_acc_norm": 82.83, "cmmlu-professional_medicine_5shot_acc_norm": 73.16, "cmmlu-electrical_engineering_5shot_acc_norm": 71.72, "cmmlu-elementary_mathematics_5shot_acc_norm": 69.05, "cmmlu-high_school_psychology_5shot_acc_norm": 84.04, "cmmlu-high_school_statistics_5shot_acc_norm": 68.06, "cmmlu-high_school_us_history_5shot_acc_norm": 90.2, "cmmlu-high_school_mathematics_5shot_acc_norm": 49.26, "cmmlu-professional_accounting_5shot_acc_norm": 54.61, "cmmlu-professional_psychology_5shot_acc_norm": 69.93, "cmmlu-college_computer_science_5shot_acc_norm": 63, "cmmlu-high_school_world_history_5shot_acc_norm": 84.81, "cmmlu-high_school_macroeconomics_5shot_acc_norm": 77.95, "cmmlu-high_school_microeconomics_5shot_acc_norm": 87.39, "cmmlu-high_school_computer_science_5shot_acc_norm": 83, "cmmlu-high_school_european_history_5shot_acc_norm": 81.21, "cmmlu-high_school_government_and_politics_5shot_acc_norm": 90.16 } }
{}
{}
{}
{}
{ "model_name": "Azure99/blossom-v5.1-9b", "model_dtype": "float16", "model_size": 0 }
{ "harness-c_arc_challenge": { "acc_norm": 36.86, "acc_stderr": 0, "c_arc_challenge_25shot_acc": 31.31, "c_arc_challenge_25shot_acc_norm": 36.86 }, "harness-c_gsm8k": { "acc": 0, "acc_stderr": 0, "c_gsm8k_5shot_acc": 0 }, "harness-c_hellaswag": { "acc_norm": 41.78, "acc_stderr": 0, "c_hellaswag_10shot_acc": 32.58, "c_hellaswag_10shot_acc_norm": 41.78 }, "harness-c-sem-v2": { "acc": 20.1025, "acc_stderr": 0, "c_sem_v2-LLSRC_5shot_acc": 15.11, "c_sem_v2-SLPWC_5shot_acc": 23.86, "c_sem_v2-SLRFC_5shot_acc": 20.58, "c_sem_v2-SLSRC_5shot_acc": 20.86, "c_sem_v2-LLSRC_5shot_acc_norm": 15.11, "c_sem_v2-SLPWC_5shot_acc_norm": 23.86, "c_sem_v2-SLRFC_5shot_acc_norm": 20.58, "c_sem_v2-SLSRC_5shot_acc_norm": 20.86 }, "harness-c_truthfulqa_mc": { "mc2": 55.63, "acc_stderr": 0, "c_truthfulqa_mc_0shot_mc1": 28.89, "c_truthfulqa_mc_0shot_mc2": 55.63 }, "harness-c_winogrande": { "acc": 57.14, "acc_stderr": 0, "c_winogrande_0shot_acc": 57.14 }, "CLCC-H": { "acc": 0.7787, "acc_stderr": 0 }, "harness-cmmlu": { "acc_norm": 21.85, "acc_stderr": 0, "cmmlu-anatomy_5shot_acc": 15.56, "cmmlu_fullavg_5shot_acc": 21.85, "cmmlu-virology_5shot_acc": 19.28, "cmmlu-astronomy_5shot_acc": 17.76, "cmmlu-marketing_5shot_acc": 23.93, "cmmlu-nutrition_5shot_acc": 19.93, "cmmlu-sociology_5shot_acc": 22.89, "cmmlu-management_5shot_acc": 16.5, "cmmlu-philosophy_5shot_acc": 19.94, "cmmlu-prehistory_5shot_acc": 19.75, "cmmlu-human_aging_5shot_acc": 32.74, "cmmlu-econometrics_5shot_acc": 26.32, "cmmlu-formal_logic_5shot_acc": 26.4, "cmmlu-global_facts_5shot_acc": 18, "cmmlu-jurisprudence_5shot_acc": 19.44, "cmmlu-miscellaneous_5shot_acc": 20.43, "cmmlu-moral_disputes_5shot_acc": 22.83, "cmmlu-business_ethics_5shot_acc": 22, "cmmlu-college_biology_5shot_acc": 22.92, "cmmlu-college_physics_5shot_acc": 15.69, "cmmlu-human_sexuality_5shot_acc": 24.43, "cmmlu-moral_scenarios_5shot_acc": 23.8, "cmmlu-world_religions_5shot_acc": 25.15, "cmmlu-abstract_algebra_5shot_acc": 27, "cmmlu-college_medicine_5shot_acc": 20.23, 
"cmmlu-machine_learning_5shot_acc": 35.71, "cmmlu-medical_genetics_5shot_acc": 24, "cmmlu-professional_law_5shot_acc": 24.64, "cmmlu-public_relations_5shot_acc": 21.82, "cmmlu-security_studies_5shot_acc": 22.45, "cmmlu-college_chemistry_5shot_acc": 16, "cmmlu-computer_security_5shot_acc": 21, "cmmlu-international_law_5shot_acc": 14.05, "cmmlu-logical_fallacies_5shot_acc": 21.47, "cmmlu-us_foreign_policy_5shot_acc": 26, "cmmlu-clinical_knowledge_5shot_acc": 20, "cmmlu-conceptual_physics_5shot_acc": 26.81, "cmmlu-college_mathematics_5shot_acc": 22, "cmmlu-high_school_biology_5shot_acc": 20.97, "cmmlu-high_school_physics_5shot_acc": 21.19, "cmmlu-high_school_chemistry_5shot_acc": 21.67, "cmmlu-high_school_geography_5shot_acc": 18.18, "cmmlu-professional_medicine_5shot_acc": 17.28, "cmmlu-electrical_engineering_5shot_acc": 24.14, "cmmlu-elementary_mathematics_5shot_acc": 21.96, "cmmlu-high_school_psychology_5shot_acc": 17.61, "cmmlu-high_school_statistics_5shot_acc": 15.28, "cmmlu-high_school_us_history_5shot_acc": 26.96, "cmmlu-high_school_mathematics_5shot_acc": 20.74, "cmmlu-professional_accounting_5shot_acc": 23.05, "cmmlu-professional_psychology_5shot_acc": 24.18, "cmmlu-college_computer_science_5shot_acc": 25, "cmmlu-high_school_world_history_5shot_acc": 27, "cmmlu-high_school_macroeconomics_5shot_acc": 20.26, "cmmlu-high_school_microeconomics_5shot_acc": 20.59, "cmmlu-high_school_computer_science_5shot_acc": 21, "cmmlu-high_school_european_history_5shot_acc": 20.61, "cmmlu-high_school_government_and_politics_5shot_acc": 18.65, "cmmlu-anatomy_5shot_acc_norm": 15.56, "cmmlu_fullavg_5shot_acc_norm": 21.85, "cmmlu-virology_5shot_acc_norm": 19.28, "cmmlu-astronomy_5shot_acc_norm": 17.76, "cmmlu-marketing_5shot_acc_norm": 23.93, "cmmlu-nutrition_5shot_acc_norm": 19.93, "cmmlu-sociology_5shot_acc_norm": 22.89, "cmmlu-management_5shot_acc_norm": 16.5, "cmmlu-philosophy_5shot_acc_norm": 19.94, "cmmlu-prehistory_5shot_acc_norm": 19.75, "cmmlu-human_aging_5shot_acc_norm": 
32.74, "cmmlu-econometrics_5shot_acc_norm": 26.32, "cmmlu-formal_logic_5shot_acc_norm": 26.4, "cmmlu-global_facts_5shot_acc_norm": 18, "cmmlu-jurisprudence_5shot_acc_norm": 19.44, "cmmlu-miscellaneous_5shot_acc_norm": 20.43, "cmmlu-moral_disputes_5shot_acc_norm": 22.83, "cmmlu-business_ethics_5shot_acc_norm": 22, "cmmlu-college_biology_5shot_acc_norm": 22.92, "cmmlu-college_physics_5shot_acc_norm": 15.69, "cmmlu-human_sexuality_5shot_acc_norm": 24.43, "cmmlu-moral_scenarios_5shot_acc_norm": 23.8, "cmmlu-world_religions_5shot_acc_norm": 25.15, "cmmlu-abstract_algebra_5shot_acc_norm": 27, "cmmlu-college_medicine_5shot_acc_norm": 20.23, "cmmlu-machine_learning_5shot_acc_norm": 35.71, "cmmlu-medical_genetics_5shot_acc_norm": 24, "cmmlu-professional_law_5shot_acc_norm": 24.64, "cmmlu-public_relations_5shot_acc_norm": 21.82, "cmmlu-security_studies_5shot_acc_norm": 22.45, "cmmlu-college_chemistry_5shot_acc_norm": 16, "cmmlu-computer_security_5shot_acc_norm": 21, "cmmlu-international_law_5shot_acc_norm": 14.05, "cmmlu-logical_fallacies_5shot_acc_norm": 21.47, "cmmlu-us_foreign_policy_5shot_acc_norm": 26, "cmmlu-clinical_knowledge_5shot_acc_norm": 20, "cmmlu-conceptual_physics_5shot_acc_norm": 26.81, "cmmlu-college_mathematics_5shot_acc_norm": 22, "cmmlu-high_school_biology_5shot_acc_norm": 20.97, "cmmlu-high_school_physics_5shot_acc_norm": 21.19, "cmmlu-high_school_chemistry_5shot_acc_norm": 21.67, "cmmlu-high_school_geography_5shot_acc_norm": 18.18, "cmmlu-professional_medicine_5shot_acc_norm": 17.28, "cmmlu-electrical_engineering_5shot_acc_norm": 24.14, "cmmlu-elementary_mathematics_5shot_acc_norm": 21.96, "cmmlu-high_school_psychology_5shot_acc_norm": 17.61, "cmmlu-high_school_statistics_5shot_acc_norm": 15.28, "cmmlu-high_school_us_history_5shot_acc_norm": 26.96, "cmmlu-high_school_mathematics_5shot_acc_norm": 20.74, "cmmlu-professional_accounting_5shot_acc_norm": 23.05, "cmmlu-professional_psychology_5shot_acc_norm": 24.18, 
"cmmlu-college_computer_science_5shot_acc_norm": 25, "cmmlu-high_school_world_history_5shot_acc_norm": 27, "cmmlu-high_school_macroeconomics_5shot_acc_norm": 20.26, "cmmlu-high_school_microeconomics_5shot_acc_norm": 20.59, "cmmlu-high_school_computer_science_5shot_acc_norm": 21, "cmmlu-high_school_european_history_5shot_acc_norm": 20.61, "cmmlu-high_school_government_and_politics_5shot_acc_norm": 18.65 } }
{}
{}
{}
{}
{ "model_name": "BAAI/Infinity-Instruct-3M-0625-Llama3-8B", "model_dtype": "float16", "model_size": 0 }
{ "harness-c_arc_challenge": { "acc_norm": 47.7, "acc_stderr": 0, "c_arc_challenge_25shot_acc": 42.41, "c_arc_challenge_25shot_acc_norm": 47.7 }, "harness-c_gsm8k": { "acc": 57.62, "acc_stderr": 0, "c_gsm8k_5shot_acc": 57.62 }, "harness-c_hellaswag": { "acc_norm": 58.86, "acc_stderr": 0, "c_hellaswag_10shot_acc": 43.36, "c_hellaswag_10shot_acc_norm": 58.86 }, "harness-c-sem-v2": { "acc": 70.9675, "acc_stderr": 0, "c_sem_v2-LLSRC_5shot_acc": 76.55, "c_sem_v2-SLPWC_5shot_acc": 62.29, "c_sem_v2-SLRFC_5shot_acc": 66.76, "c_sem_v2-SLSRC_5shot_acc": 78.27, "c_sem_v2-LLSRC_5shot_acc_norm": 76.55, "c_sem_v2-SLPWC_5shot_acc_norm": 62.29, "c_sem_v2-SLRFC_5shot_acc_norm": 66.76, "c_sem_v2-SLSRC_5shot_acc_norm": 78.27 }, "harness-c_truthfulqa_mc": { "mc2": 53.02, "acc_stderr": 0, "c_truthfulqa_mc_0shot_mc1": 31.46, "c_truthfulqa_mc_0shot_mc2": 53.02 }, "harness-c_winogrande": { "acc": 62.27, "acc_stderr": 0, "c_winogrande_0shot_acc": 62.27 }, "CLCC-H": { "acc": 0.7436, "acc_stderr": 0 }, "harness-cmmlu": { "acc_norm": 53.81, "acc_stderr": 0, "cmmlu-anatomy_5shot_acc": 47.41, "cmmlu_fullavg_5shot_acc": 53.81, "cmmlu-virology_5shot_acc": 43.98, "cmmlu-astronomy_5shot_acc": 59.87, "cmmlu-marketing_5shot_acc": 76.5, "cmmlu-nutrition_5shot_acc": 59.48, "cmmlu-sociology_5shot_acc": 72.64, "cmmlu-management_5shot_acc": 66.02, "cmmlu-philosophy_5shot_acc": 53.7, "cmmlu-prehistory_5shot_acc": 54.32, "cmmlu-human_aging_5shot_acc": 52.47, "cmmlu-econometrics_5shot_acc": 40.35, "cmmlu-formal_logic_5shot_acc": 45.6, "cmmlu-global_facts_5shot_acc": 41, "cmmlu-jurisprudence_5shot_acc": 59.26, "cmmlu-miscellaneous_5shot_acc": 59.9, "cmmlu-moral_disputes_5shot_acc": 54.62, "cmmlu-business_ethics_5shot_acc": 55, "cmmlu-college_biology_5shot_acc": 49.31, "cmmlu-college_physics_5shot_acc": 36.27, "cmmlu-human_sexuality_5shot_acc": 58.78, "cmmlu-moral_scenarios_5shot_acc": 28.27, "cmmlu-world_religions_5shot_acc": 60.23, "cmmlu-abstract_algebra_5shot_acc": 34, "cmmlu-college_medicine_5shot_acc": 
54.91, "cmmlu-machine_learning_5shot_acc": 39.29, "cmmlu-medical_genetics_5shot_acc": 52, "cmmlu-professional_law_5shot_acc": 37.48, "cmmlu-public_relations_5shot_acc": 63.64, "cmmlu-security_studies_5shot_acc": 67.35, "cmmlu-college_chemistry_5shot_acc": 37, "cmmlu-computer_security_5shot_acc": 66, "cmmlu-international_law_5shot_acc": 73.55, "cmmlu-logical_fallacies_5shot_acc": 53.99, "cmmlu-us_foreign_policy_5shot_acc": 77, "cmmlu-clinical_knowledge_5shot_acc": 59.25, "cmmlu-conceptual_physics_5shot_acc": 46.38, "cmmlu-college_mathematics_5shot_acc": 39, "cmmlu-high_school_biology_5shot_acc": 60, "cmmlu-high_school_physics_5shot_acc": 35.76, "cmmlu-high_school_chemistry_5shot_acc": 44.83, "cmmlu-high_school_geography_5shot_acc": 67.68, "cmmlu-professional_medicine_5shot_acc": 45.96, "cmmlu-electrical_engineering_5shot_acc": 51.72, "cmmlu-elementary_mathematics_5shot_acc": 42.59, "cmmlu-high_school_psychology_5shot_acc": 65.32, "cmmlu-high_school_statistics_5shot_acc": 45.83, "cmmlu-high_school_us_history_5shot_acc": 71.57, "cmmlu-high_school_mathematics_5shot_acc": 37.04, "cmmlu-professional_accounting_5shot_acc": 39.01, "cmmlu-professional_psychology_5shot_acc": 52.45, "cmmlu-college_computer_science_5shot_acc": 48, "cmmlu-high_school_world_history_5shot_acc": 72.57, "cmmlu-high_school_macroeconomics_5shot_acc": 55.38, "cmmlu-high_school_microeconomics_5shot_acc": 53.78, "cmmlu-high_school_computer_science_5shot_acc": 63, "cmmlu-high_school_european_history_5shot_acc": 71.52, "cmmlu-high_school_government_and_politics_5shot_acc": 67.36, "cmmlu-anatomy_5shot_acc_norm": 47.41, "cmmlu_fullavg_5shot_acc_norm": 53.81, "cmmlu-virology_5shot_acc_norm": 43.98, "cmmlu-astronomy_5shot_acc_norm": 59.87, "cmmlu-marketing_5shot_acc_norm": 76.5, "cmmlu-nutrition_5shot_acc_norm": 59.48, "cmmlu-sociology_5shot_acc_norm": 72.64, "cmmlu-management_5shot_acc_norm": 66.02, "cmmlu-philosophy_5shot_acc_norm": 53.7, "cmmlu-prehistory_5shot_acc_norm": 54.32, 
"cmmlu-human_aging_5shot_acc_norm": 52.47, "cmmlu-econometrics_5shot_acc_norm": 40.35, "cmmlu-formal_logic_5shot_acc_norm": 45.6, "cmmlu-global_facts_5shot_acc_norm": 41, "cmmlu-jurisprudence_5shot_acc_norm": 59.26, "cmmlu-miscellaneous_5shot_acc_norm": 59.9, "cmmlu-moral_disputes_5shot_acc_norm": 54.62, "cmmlu-business_ethics_5shot_acc_norm": 55, "cmmlu-college_biology_5shot_acc_norm": 49.31, "cmmlu-college_physics_5shot_acc_norm": 36.27, "cmmlu-human_sexuality_5shot_acc_norm": 58.78, "cmmlu-moral_scenarios_5shot_acc_norm": 28.27, "cmmlu-world_religions_5shot_acc_norm": 60.23, "cmmlu-abstract_algebra_5shot_acc_norm": 34, "cmmlu-college_medicine_5shot_acc_norm": 54.91, "cmmlu-machine_learning_5shot_acc_norm": 39.29, "cmmlu-medical_genetics_5shot_acc_norm": 52, "cmmlu-professional_law_5shot_acc_norm": 37.48, "cmmlu-public_relations_5shot_acc_norm": 63.64, "cmmlu-security_studies_5shot_acc_norm": 67.35, "cmmlu-college_chemistry_5shot_acc_norm": 37, "cmmlu-computer_security_5shot_acc_norm": 66, "cmmlu-international_law_5shot_acc_norm": 73.55, "cmmlu-logical_fallacies_5shot_acc_norm": 53.99, "cmmlu-us_foreign_policy_5shot_acc_norm": 77, "cmmlu-clinical_knowledge_5shot_acc_norm": 59.25, "cmmlu-conceptual_physics_5shot_acc_norm": 46.38, "cmmlu-college_mathematics_5shot_acc_norm": 39, "cmmlu-high_school_biology_5shot_acc_norm": 60, "cmmlu-high_school_physics_5shot_acc_norm": 35.76, "cmmlu-high_school_chemistry_5shot_acc_norm": 44.83, "cmmlu-high_school_geography_5shot_acc_norm": 67.68, "cmmlu-professional_medicine_5shot_acc_norm": 45.96, "cmmlu-electrical_engineering_5shot_acc_norm": 51.72, "cmmlu-elementary_mathematics_5shot_acc_norm": 42.59, "cmmlu-high_school_psychology_5shot_acc_norm": 65.32, "cmmlu-high_school_statistics_5shot_acc_norm": 45.83, "cmmlu-high_school_us_history_5shot_acc_norm": 71.57, "cmmlu-high_school_mathematics_5shot_acc_norm": 37.04, "cmmlu-professional_accounting_5shot_acc_norm": 39.01, "cmmlu-professional_psychology_5shot_acc_norm": 52.45, 
"cmmlu-college_computer_science_5shot_acc_norm": 48, "cmmlu-high_school_world_history_5shot_acc_norm": 72.57, "cmmlu-high_school_macroeconomics_5shot_acc_norm": 55.38, "cmmlu-high_school_microeconomics_5shot_acc_norm": 53.78, "cmmlu-high_school_computer_science_5shot_acc_norm": 63, "cmmlu-high_school_european_history_5shot_acc_norm": 71.52, "cmmlu-high_school_government_and_politics_5shot_acc_norm": 67.36 } }
{}
{}
{}
{}
{ "model_name": "BAAI/Infinity-Instruct-3M-0625-Yi-1.5-9B", "model_dtype": "float16", "model_size": 0 }
{ "harness-c_arc_challenge": { "acc_norm": 56.66, "acc_stderr": 0, "c_arc_challenge_25shot_acc": 52.47, "c_arc_challenge_25shot_acc_norm": 56.66 }, "harness-c_gsm8k": { "acc": 62.02, "acc_stderr": 0, "c_gsm8k_5shot_acc": 62.02 }, "harness-c_hellaswag": { "acc_norm": 64.06, "acc_stderr": 0, "c_hellaswag_10shot_acc": 46.93, "c_hellaswag_10shot_acc_norm": 64.06 }, "harness-c-sem-v2": { "acc": 86.0075, "acc_stderr": 0, "c_sem_v2-LLSRC_5shot_acc": 88.78, "c_sem_v2-SLPWC_5shot_acc": 80.29, "c_sem_v2-SLRFC_5shot_acc": 90.36, "c_sem_v2-SLSRC_5shot_acc": 84.6, "c_sem_v2-LLSRC_5shot_acc_norm": 88.78, "c_sem_v2-SLPWC_5shot_acc_norm": 80.29, "c_sem_v2-SLRFC_5shot_acc_norm": 90.36, "c_sem_v2-SLSRC_5shot_acc_norm": 84.6 }, "harness-c_truthfulqa_mc": { "mc2": 53.16, "acc_stderr": 0, "c_truthfulqa_mc_0shot_mc1": 30.97, "c_truthfulqa_mc_0shot_mc2": 53.16 }, "harness-c_winogrande": { "acc": 65.59, "acc_stderr": 0, "c_winogrande_0shot_acc": 65.59 }, "CLCC-H": { "acc": 0.7452, "acc_stderr": 0 }, "harness-cmmlu": { "acc_norm": 63.12, "acc_stderr": 0, "cmmlu-anatomy_5shot_acc": 50.37, "cmmlu_fullavg_5shot_acc": 63.12, "cmmlu-virology_5shot_acc": 45.18, "cmmlu-astronomy_5shot_acc": 71.71, "cmmlu-marketing_5shot_acc": 82.91, "cmmlu-nutrition_5shot_acc": 69.61, "cmmlu-sociology_5shot_acc": 79.1, "cmmlu-management_5shot_acc": 76.7, "cmmlu-philosophy_5shot_acc": 65.59, "cmmlu-prehistory_5shot_acc": 64.2, "cmmlu-human_aging_5shot_acc": 66.37, "cmmlu-econometrics_5shot_acc": 51.75, "cmmlu-formal_logic_5shot_acc": 49.6, "cmmlu-global_facts_5shot_acc": 40, "cmmlu-jurisprudence_5shot_acc": 76.85, "cmmlu-miscellaneous_5shot_acc": 71.26, "cmmlu-moral_disputes_5shot_acc": 67.92, "cmmlu-business_ethics_5shot_acc": 69, "cmmlu-college_biology_5shot_acc": 63.19, "cmmlu-college_physics_5shot_acc": 47.06, "cmmlu-human_sexuality_5shot_acc": 67.18, "cmmlu-moral_scenarios_5shot_acc": 29.83, "cmmlu-world_religions_5shot_acc": 64.33, "cmmlu-abstract_algebra_5shot_acc": 29, "cmmlu-college_medicine_5shot_acc": 
63.58, "cmmlu-machine_learning_5shot_acc": 49.11, "cmmlu-medical_genetics_5shot_acc": 60, "cmmlu-professional_law_5shot_acc": 45.44, "cmmlu-public_relations_5shot_acc": 63.64, "cmmlu-security_studies_5shot_acc": 71.02, "cmmlu-college_chemistry_5shot_acc": 52, "cmmlu-computer_security_5shot_acc": 71, "cmmlu-international_law_5shot_acc": 78.51, "cmmlu-logical_fallacies_5shot_acc": 66.87, "cmmlu-us_foreign_policy_5shot_acc": 85, "cmmlu-clinical_knowledge_5shot_acc": 65.66, "cmmlu-conceptual_physics_5shot_acc": 64.26, "cmmlu-college_mathematics_5shot_acc": 46, "cmmlu-high_school_biology_5shot_acc": 75.48, "cmmlu-high_school_physics_5shot_acc": 44.37, "cmmlu-high_school_chemistry_5shot_acc": 54.19, "cmmlu-high_school_geography_5shot_acc": 83.33, "cmmlu-professional_medicine_5shot_acc": 58.46, "cmmlu-electrical_engineering_5shot_acc": 63.45, "cmmlu-elementary_mathematics_5shot_acc": 55.56, "cmmlu-high_school_psychology_5shot_acc": 78.53, "cmmlu-high_school_statistics_5shot_acc": 59.26, "cmmlu-high_school_us_history_5shot_acc": 80.39, "cmmlu-high_school_mathematics_5shot_acc": 42.96, "cmmlu-professional_accounting_5shot_acc": 47.87, "cmmlu-professional_psychology_5shot_acc": 60.95, "cmmlu-college_computer_science_5shot_acc": 51, "cmmlu-high_school_world_history_5shot_acc": 81.86, "cmmlu-high_school_macroeconomics_5shot_acc": 70.26, "cmmlu-high_school_microeconomics_5shot_acc": 73.95, "cmmlu-high_school_computer_science_5shot_acc": 74, "cmmlu-high_school_european_history_5shot_acc": 79.39, "cmmlu-high_school_government_and_politics_5shot_acc": 81.87, "cmmlu-anatomy_5shot_acc_norm": 50.37, "cmmlu_fullavg_5shot_acc_norm": 63.12, "cmmlu-virology_5shot_acc_norm": 45.18, "cmmlu-astronomy_5shot_acc_norm": 71.71, "cmmlu-marketing_5shot_acc_norm": 82.91, "cmmlu-nutrition_5shot_acc_norm": 69.61, "cmmlu-sociology_5shot_acc_norm": 79.1, "cmmlu-management_5shot_acc_norm": 76.7, "cmmlu-philosophy_5shot_acc_norm": 65.59, "cmmlu-prehistory_5shot_acc_norm": 64.2, 
"cmmlu-human_aging_5shot_acc_norm": 66.37, "cmmlu-econometrics_5shot_acc_norm": 51.75, "cmmlu-formal_logic_5shot_acc_norm": 49.6, "cmmlu-global_facts_5shot_acc_norm": 40, "cmmlu-jurisprudence_5shot_acc_norm": 76.85, "cmmlu-miscellaneous_5shot_acc_norm": 71.26, "cmmlu-moral_disputes_5shot_acc_norm": 67.92, "cmmlu-business_ethics_5shot_acc_norm": 69, "cmmlu-college_biology_5shot_acc_norm": 63.19, "cmmlu-college_physics_5shot_acc_norm": 47.06, "cmmlu-human_sexuality_5shot_acc_norm": 67.18, "cmmlu-moral_scenarios_5shot_acc_norm": 29.83, "cmmlu-world_religions_5shot_acc_norm": 64.33, "cmmlu-abstract_algebra_5shot_acc_norm": 29, "cmmlu-college_medicine_5shot_acc_norm": 63.58, "cmmlu-machine_learning_5shot_acc_norm": 49.11, "cmmlu-medical_genetics_5shot_acc_norm": 60, "cmmlu-professional_law_5shot_acc_norm": 45.44, "cmmlu-public_relations_5shot_acc_norm": 63.64, "cmmlu-security_studies_5shot_acc_norm": 71.02, "cmmlu-college_chemistry_5shot_acc_norm": 52, "cmmlu-computer_security_5shot_acc_norm": 71, "cmmlu-international_law_5shot_acc_norm": 78.51, "cmmlu-logical_fallacies_5shot_acc_norm": 66.87, "cmmlu-us_foreign_policy_5shot_acc_norm": 85, "cmmlu-clinical_knowledge_5shot_acc_norm": 65.66, "cmmlu-conceptual_physics_5shot_acc_norm": 64.26, "cmmlu-college_mathematics_5shot_acc_norm": 46, "cmmlu-high_school_biology_5shot_acc_norm": 75.48, "cmmlu-high_school_physics_5shot_acc_norm": 44.37, "cmmlu-high_school_chemistry_5shot_acc_norm": 54.19, "cmmlu-high_school_geography_5shot_acc_norm": 83.33, "cmmlu-professional_medicine_5shot_acc_norm": 58.46, "cmmlu-electrical_engineering_5shot_acc_norm": 63.45, "cmmlu-elementary_mathematics_5shot_acc_norm": 55.56, "cmmlu-high_school_psychology_5shot_acc_norm": 78.53, "cmmlu-high_school_statistics_5shot_acc_norm": 59.26, "cmmlu-high_school_us_history_5shot_acc_norm": 80.39, "cmmlu-high_school_mathematics_5shot_acc_norm": 42.96, "cmmlu-professional_accounting_5shot_acc_norm": 47.87, "cmmlu-professional_psychology_5shot_acc_norm": 60.95, 
"cmmlu-college_computer_science_5shot_acc_norm": 51, "cmmlu-high_school_world_history_5shot_acc_norm": 81.86, "cmmlu-high_school_macroeconomics_5shot_acc_norm": 70.26, "cmmlu-high_school_microeconomics_5shot_acc_norm": 73.95, "cmmlu-high_school_computer_science_5shot_acc_norm": 74, "cmmlu-high_school_european_history_5shot_acc_norm": 79.39, "cmmlu-high_school_government_and_politics_5shot_acc_norm": 81.87 } }
{}
{}
{}
{}
{ "model_name": "BAAI/Infinity-Instruct-7M-0729-Llama3_1-8B", "model_dtype": "float16", "model_size": 0 }
{ "harness-c_arc_challenge": { "acc_norm": 48.98, "acc_stderr": 0, "c_arc_challenge_25shot_acc": 43, "c_arc_challenge_25shot_acc_norm": 48.98 }, "harness-c_gsm8k": { "acc": 55.34, "acc_stderr": 0, "c_gsm8k_5shot_acc": 55.34 }, "harness-c_hellaswag": { "acc_norm": 59.53, "acc_stderr": 0, "c_hellaswag_10shot_acc": 44.1, "c_hellaswag_10shot_acc_norm": 59.53 }, "harness-c-sem-v2": { "acc": 73.1225, "acc_stderr": 0, "c_sem_v2-LLSRC_5shot_acc": 75.54, "c_sem_v2-SLPWC_5shot_acc": 64.43, "c_sem_v2-SLRFC_5shot_acc": 71.37, "c_sem_v2-SLSRC_5shot_acc": 81.15, "c_sem_v2-LLSRC_5shot_acc_norm": 75.54, "c_sem_v2-SLPWC_5shot_acc_norm": 64.43, "c_sem_v2-SLRFC_5shot_acc_norm": 71.37, "c_sem_v2-SLSRC_5shot_acc_norm": 81.15 }, "harness-c_truthfulqa_mc": { "mc2": 53.53, "acc_stderr": 0, "c_truthfulqa_mc_0shot_mc1": 29.99, "c_truthfulqa_mc_0shot_mc2": 53.53 }, "harness-c_winogrande": { "acc": 64.64, "acc_stderr": 0, "c_winogrande_0shot_acc": 64.64 }, "CLCC-H": { "acc": 0.7627, "acc_stderr": 0 }, "harness-cmmlu": { "acc_norm": 53.82, "acc_stderr": 0, "cmmlu-anatomy_5shot_acc": 43.7, "cmmlu_fullavg_5shot_acc": 53.82, "cmmlu-virology_5shot_acc": 44.58, "cmmlu-astronomy_5shot_acc": 61.84, "cmmlu-marketing_5shot_acc": 72.65, "cmmlu-nutrition_5shot_acc": 59.48, "cmmlu-sociology_5shot_acc": 72.14, "cmmlu-management_5shot_acc": 68.93, "cmmlu-philosophy_5shot_acc": 56.27, "cmmlu-prehistory_5shot_acc": 53.4, "cmmlu-human_aging_5shot_acc": 56.05, "cmmlu-econometrics_5shot_acc": 39.47, "cmmlu-formal_logic_5shot_acc": 42.4, "cmmlu-global_facts_5shot_acc": 42, "cmmlu-jurisprudence_5shot_acc": 59.26, "cmmlu-miscellaneous_5shot_acc": 61.17, "cmmlu-moral_disputes_5shot_acc": 56.65, "cmmlu-business_ethics_5shot_acc": 54, "cmmlu-college_biology_5shot_acc": 52.78, "cmmlu-college_physics_5shot_acc": 29.41, "cmmlu-human_sexuality_5shot_acc": 60.31, "cmmlu-moral_scenarios_5shot_acc": 38.44, "cmmlu-world_religions_5shot_acc": 61.4, "cmmlu-abstract_algebra_5shot_acc": 31, "cmmlu-college_medicine_5shot_acc": 
50.29, "cmmlu-machine_learning_5shot_acc": 51.79, "cmmlu-medical_genetics_5shot_acc": 54, "cmmlu-professional_law_5shot_acc": 40.03, "cmmlu-public_relations_5shot_acc": 59.09, "cmmlu-security_studies_5shot_acc": 69.39, "cmmlu-college_chemistry_5shot_acc": 42, "cmmlu-computer_security_5shot_acc": 70, "cmmlu-international_law_5shot_acc": 69.42, "cmmlu-logical_fallacies_5shot_acc": 45.4, "cmmlu-us_foreign_policy_5shot_acc": 70, "cmmlu-clinical_knowledge_5shot_acc": 58.11, "cmmlu-conceptual_physics_5shot_acc": 48.09, "cmmlu-college_mathematics_5shot_acc": 35, "cmmlu-high_school_biology_5shot_acc": 59.68, "cmmlu-high_school_physics_5shot_acc": 37.09, "cmmlu-high_school_chemistry_5shot_acc": 42.36, "cmmlu-high_school_geography_5shot_acc": 65.15, "cmmlu-professional_medicine_5shot_acc": 48.16, "cmmlu-electrical_engineering_5shot_acc": 56.55, "cmmlu-elementary_mathematics_5shot_acc": 42.06, "cmmlu-high_school_psychology_5shot_acc": 64.4, "cmmlu-high_school_statistics_5shot_acc": 44.44, "cmmlu-high_school_us_history_5shot_acc": 68.63, "cmmlu-high_school_mathematics_5shot_acc": 32.59, "cmmlu-professional_accounting_5shot_acc": 39.36, "cmmlu-professional_psychology_5shot_acc": 52.29, "cmmlu-college_computer_science_5shot_acc": 50, "cmmlu-high_school_world_history_5shot_acc": 73, "cmmlu-high_school_macroeconomics_5shot_acc": 56.15, "cmmlu-high_school_microeconomics_5shot_acc": 57.56, "cmmlu-high_school_computer_science_5shot_acc": 65, "cmmlu-high_school_european_history_5shot_acc": 67.88, "cmmlu-high_school_government_and_politics_5shot_acc": 65.28, "cmmlu-anatomy_5shot_acc_norm": 43.7, "cmmlu_fullavg_5shot_acc_norm": 53.82, "cmmlu-virology_5shot_acc_norm": 44.58, "cmmlu-astronomy_5shot_acc_norm": 61.84, "cmmlu-marketing_5shot_acc_norm": 72.65, "cmmlu-nutrition_5shot_acc_norm": 59.48, "cmmlu-sociology_5shot_acc_norm": 72.14, "cmmlu-management_5shot_acc_norm": 68.93, "cmmlu-philosophy_5shot_acc_norm": 56.27, "cmmlu-prehistory_5shot_acc_norm": 53.4, 
"cmmlu-human_aging_5shot_acc_norm": 56.05, "cmmlu-econometrics_5shot_acc_norm": 39.47, "cmmlu-formal_logic_5shot_acc_norm": 42.4, "cmmlu-global_facts_5shot_acc_norm": 42, "cmmlu-jurisprudence_5shot_acc_norm": 59.26, "cmmlu-miscellaneous_5shot_acc_norm": 61.17, "cmmlu-moral_disputes_5shot_acc_norm": 56.65, "cmmlu-business_ethics_5shot_acc_norm": 54, "cmmlu-college_biology_5shot_acc_norm": 52.78, "cmmlu-college_physics_5shot_acc_norm": 29.41, "cmmlu-human_sexuality_5shot_acc_norm": 60.31, "cmmlu-moral_scenarios_5shot_acc_norm": 38.44, "cmmlu-world_religions_5shot_acc_norm": 61.4, "cmmlu-abstract_algebra_5shot_acc_norm": 31, "cmmlu-college_medicine_5shot_acc_norm": 50.29, "cmmlu-machine_learning_5shot_acc_norm": 51.79, "cmmlu-medical_genetics_5shot_acc_norm": 54, "cmmlu-professional_law_5shot_acc_norm": 40.03, "cmmlu-public_relations_5shot_acc_norm": 59.09, "cmmlu-security_studies_5shot_acc_norm": 69.39, "cmmlu-college_chemistry_5shot_acc_norm": 42, "cmmlu-computer_security_5shot_acc_norm": 70, "cmmlu-international_law_5shot_acc_norm": 69.42, "cmmlu-logical_fallacies_5shot_acc_norm": 45.4, "cmmlu-us_foreign_policy_5shot_acc_norm": 70, "cmmlu-clinical_knowledge_5shot_acc_norm": 58.11, "cmmlu-conceptual_physics_5shot_acc_norm": 48.09, "cmmlu-college_mathematics_5shot_acc_norm": 35, "cmmlu-high_school_biology_5shot_acc_norm": 59.68, "cmmlu-high_school_physics_5shot_acc_norm": 37.09, "cmmlu-high_school_chemistry_5shot_acc_norm": 42.36, "cmmlu-high_school_geography_5shot_acc_norm": 65.15, "cmmlu-professional_medicine_5shot_acc_norm": 48.16, "cmmlu-electrical_engineering_5shot_acc_norm": 56.55, "cmmlu-elementary_mathematics_5shot_acc_norm": 42.06, "cmmlu-high_school_psychology_5shot_acc_norm": 64.4, "cmmlu-high_school_statistics_5shot_acc_norm": 44.44, "cmmlu-high_school_us_history_5shot_acc_norm": 68.63, "cmmlu-high_school_mathematics_5shot_acc_norm": 32.59, "cmmlu-professional_accounting_5shot_acc_norm": 39.36, "cmmlu-professional_psychology_5shot_acc_norm": 52.29, 
"cmmlu-college_computer_science_5shot_acc_norm": 50, "cmmlu-high_school_world_history_5shot_acc_norm": 73, "cmmlu-high_school_macroeconomics_5shot_acc_norm": 56.15, "cmmlu-high_school_microeconomics_5shot_acc_norm": 57.56, "cmmlu-high_school_computer_science_5shot_acc_norm": 65, "cmmlu-high_school_european_history_5shot_acc_norm": 67.88, "cmmlu-high_school_government_and_politics_5shot_acc_norm": 65.28 } }
{}
{}
{}
{}
{ "model_name": "CausalLM/34b-beta", "model_dtype": "float16", "model_size": 0 }
{ "harness-c_arc_challenge": { "acc_norm": 59.22, "acc_stderr": 0, "c_arc_challenge_25shot_acc": 54.27, "c_arc_challenge_25shot_acc_norm": 59.22 }, "harness-c_gsm8k": { "acc": 47.99, "acc_stderr": 0, "c_gsm8k_5shot_acc": 47.99 }, "harness-c_hellaswag": { "acc_norm": 66.42, "acc_stderr": 0, "c_hellaswag_10shot_acc": 48.42, "c_hellaswag_10shot_acc_norm": 66.42 }, "harness-c-sem-v2": { "acc": 87.625, "acc_stderr": 0, "c_sem_v2-LLSRC_5shot_acc": 91.8, "c_sem_v2-SLPWC_5shot_acc": 80.57, "c_sem_v2-SLRFC_5shot_acc": 91.51, "c_sem_v2-SLSRC_5shot_acc": 86.62, "c_sem_v2-LLSRC_5shot_acc_norm": 91.8, "c_sem_v2-SLPWC_5shot_acc_norm": 80.57, "c_sem_v2-SLRFC_5shot_acc_norm": 91.51, "c_sem_v2-SLSRC_5shot_acc_norm": 86.62 }, "harness-c_truthfulqa_mc": { "mc2": 52.78, "acc_stderr": 0, "c_truthfulqa_mc_0shot_mc1": 30.35, "c_truthfulqa_mc_0shot_mc2": 52.78 }, "harness-c_winogrande": { "acc": 65.75, "acc_stderr": 0, "c_winogrande_0shot_acc": 65.75 }, "CLCC-H": { "acc": 0.6608, "acc_stderr": 0 }, "harness-cmmlu": { "acc_norm": 71.71, "acc_stderr": 0, "cmmlu-anatomy_5shot_acc": 61.48, "cmmlu_fullavg_5shot_acc": 71.71, "cmmlu-virology_5shot_acc": 54.82, "cmmlu-astronomy_5shot_acc": 76.97, "cmmlu-marketing_5shot_acc": 84.19, "cmmlu-nutrition_5shot_acc": 82.68, "cmmlu-sociology_5shot_acc": 86.07, "cmmlu-management_5shot_acc": 82.52, "cmmlu-philosophy_5shot_acc": 76.85, "cmmlu-prehistory_5shot_acc": 75.62, "cmmlu-human_aging_5shot_acc": 73.54, "cmmlu-econometrics_5shot_acc": 48.25, "cmmlu-formal_logic_5shot_acc": 42.4, "cmmlu-global_facts_5shot_acc": 64, "cmmlu-jurisprudence_5shot_acc": 79.63, "cmmlu-miscellaneous_5shot_acc": 80.59, "cmmlu-moral_disputes_5shot_acc": 70.81, "cmmlu-business_ethics_5shot_acc": 70, "cmmlu-college_biology_5shot_acc": 70.14, "cmmlu-college_physics_5shot_acc": 60.78, "cmmlu-human_sexuality_5shot_acc": 74.05, "cmmlu-moral_scenarios_5shot_acc": 66.26, "cmmlu-world_religions_5shot_acc": 74.27, "cmmlu-abstract_algebra_5shot_acc": 44, "cmmlu-college_medicine_5shot_acc": 
72.25, "cmmlu-machine_learning_5shot_acc": 61.61, "cmmlu-medical_genetics_5shot_acc": 77, "cmmlu-professional_law_5shot_acc": 55.54, "cmmlu-public_relations_5shot_acc": 72.73, "cmmlu-security_studies_5shot_acc": 79.59, "cmmlu-college_chemistry_5shot_acc": 61, "cmmlu-computer_security_5shot_acc": 70, "cmmlu-international_law_5shot_acc": 85.95, "cmmlu-logical_fallacies_5shot_acc": 65.64, "cmmlu-us_foreign_policy_5shot_acc": 91, "cmmlu-clinical_knowledge_5shot_acc": 75.09, "cmmlu-conceptual_physics_5shot_acc": 69.79, "cmmlu-college_mathematics_5shot_acc": 52, "cmmlu-high_school_biology_5shot_acc": 80.65, "cmmlu-high_school_physics_5shot_acc": 50.99, "cmmlu-high_school_chemistry_5shot_acc": 68.47, "cmmlu-high_school_geography_5shot_acc": 79.8, "cmmlu-professional_medicine_5shot_acc": 77.94, "cmmlu-electrical_engineering_5shot_acc": 67.59, "cmmlu-elementary_mathematics_5shot_acc": 73.81, "cmmlu-high_school_psychology_5shot_acc": 85.32, "cmmlu-high_school_statistics_5shot_acc": 68.98, "cmmlu-high_school_us_history_5shot_acc": 87.75, "cmmlu-high_school_mathematics_5shot_acc": 56.3, "cmmlu-professional_accounting_5shot_acc": 62.06, "cmmlu-professional_psychology_5shot_acc": 70.26, "cmmlu-college_computer_science_5shot_acc": 62, "cmmlu-high_school_world_history_5shot_acc": 86.08, "cmmlu-high_school_macroeconomics_5shot_acc": 80.51, "cmmlu-high_school_microeconomics_5shot_acc": 84.87, "cmmlu-high_school_computer_science_5shot_acc": 86, "cmmlu-high_school_european_history_5shot_acc": 80.61, "cmmlu-high_school_government_and_politics_5shot_acc": 88.6, "cmmlu-anatomy_5shot_acc_norm": 61.48, "cmmlu_fullavg_5shot_acc_norm": 71.71, "cmmlu-virology_5shot_acc_norm": 54.82, "cmmlu-astronomy_5shot_acc_norm": 76.97, "cmmlu-marketing_5shot_acc_norm": 84.19, "cmmlu-nutrition_5shot_acc_norm": 82.68, "cmmlu-sociology_5shot_acc_norm": 86.07, "cmmlu-management_5shot_acc_norm": 82.52, "cmmlu-philosophy_5shot_acc_norm": 76.85, "cmmlu-prehistory_5shot_acc_norm": 75.62, 
"cmmlu-human_aging_5shot_acc_norm": 73.54, "cmmlu-econometrics_5shot_acc_norm": 48.25, "cmmlu-formal_logic_5shot_acc_norm": 42.4, "cmmlu-global_facts_5shot_acc_norm": 64, "cmmlu-jurisprudence_5shot_acc_norm": 79.63, "cmmlu-miscellaneous_5shot_acc_norm": 80.59, "cmmlu-moral_disputes_5shot_acc_norm": 70.81, "cmmlu-business_ethics_5shot_acc_norm": 70, "cmmlu-college_biology_5shot_acc_norm": 70.14, "cmmlu-college_physics_5shot_acc_norm": 60.78, "cmmlu-human_sexuality_5shot_acc_norm": 74.05, "cmmlu-moral_scenarios_5shot_acc_norm": 66.26, "cmmlu-world_religions_5shot_acc_norm": 74.27, "cmmlu-abstract_algebra_5shot_acc_norm": 44, "cmmlu-college_medicine_5shot_acc_norm": 72.25, "cmmlu-machine_learning_5shot_acc_norm": 61.61, "cmmlu-medical_genetics_5shot_acc_norm": 77, "cmmlu-professional_law_5shot_acc_norm": 55.54, "cmmlu-public_relations_5shot_acc_norm": 72.73, "cmmlu-security_studies_5shot_acc_norm": 79.59, "cmmlu-college_chemistry_5shot_acc_norm": 61, "cmmlu-computer_security_5shot_acc_norm": 70, "cmmlu-international_law_5shot_acc_norm": 85.95, "cmmlu-logical_fallacies_5shot_acc_norm": 65.64, "cmmlu-us_foreign_policy_5shot_acc_norm": 91, "cmmlu-clinical_knowledge_5shot_acc_norm": 75.09, "cmmlu-conceptual_physics_5shot_acc_norm": 69.79, "cmmlu-college_mathematics_5shot_acc_norm": 52, "cmmlu-high_school_biology_5shot_acc_norm": 80.65, "cmmlu-high_school_physics_5shot_acc_norm": 50.99, "cmmlu-high_school_chemistry_5shot_acc_norm": 68.47, "cmmlu-high_school_geography_5shot_acc_norm": 79.8, "cmmlu-professional_medicine_5shot_acc_norm": 77.94, "cmmlu-electrical_engineering_5shot_acc_norm": 67.59, "cmmlu-elementary_mathematics_5shot_acc_norm": 73.81, "cmmlu-high_school_psychology_5shot_acc_norm": 85.32, "cmmlu-high_school_statistics_5shot_acc_norm": 68.98, "cmmlu-high_school_us_history_5shot_acc_norm": 87.75, "cmmlu-high_school_mathematics_5shot_acc_norm": 56.3, "cmmlu-professional_accounting_5shot_acc_norm": 62.06, "cmmlu-professional_psychology_5shot_acc_norm": 70.26, 
"cmmlu-college_computer_science_5shot_acc_norm": 62, "cmmlu-high_school_world_history_5shot_acc_norm": 86.08, "cmmlu-high_school_macroeconomics_5shot_acc_norm": 80.51, "cmmlu-high_school_microeconomics_5shot_acc_norm": 84.87, "cmmlu-high_school_computer_science_5shot_acc_norm": 86, "cmmlu-high_school_european_history_5shot_acc_norm": 80.61, "cmmlu-high_school_government_and_politics_5shot_acc_norm": 88.6 } }
{}
{}
{}
{}
{ "model_name": "CofeAI/Tele-FLM", "model_dtype": "float16", "model_size": 0 }
{ "harness-c_arc_challenge": { "acc_norm": 56.57, "acc_stderr": 0, "c_arc_challenge_25shot_acc": 52.22, "c_arc_challenge_25shot_acc_norm": 56.57 }, "harness-c_gsm8k": { "acc": 30.02, "acc_stderr": 0, "c_gsm8k_5shot_acc": 30.02 }, "harness-c_hellaswag": { "acc_norm": 67.71, "acc_stderr": 0, "c_hellaswag_10shot_acc": 49.82, "c_hellaswag_10shot_acc_norm": 67.71 }, "harness-c-sem-v2": { "acc": 84.3575, "acc_stderr": 0, "c_sem_v2-LLSRC_5shot_acc": 89.93, "c_sem_v2-SLPWC_5shot_acc": 76.57, "c_sem_v2-SLRFC_5shot_acc": 87.48, "c_sem_v2-SLSRC_5shot_acc": 83.45, "c_sem_v2-LLSRC_5shot_acc_norm": 89.93, "c_sem_v2-SLPWC_5shot_acc_norm": 76.57, "c_sem_v2-SLRFC_5shot_acc_norm": 87.48, "c_sem_v2-SLSRC_5shot_acc_norm": 83.45 }, "harness-c_truthfulqa_mc": { "mc2": 51.07, "acc_stderr": 0, "c_truthfulqa_mc_0shot_mc1": 28.64, "c_truthfulqa_mc_0shot_mc2": 51.07 }, "harness-c_winogrande": { "acc": 67.56, "acc_stderr": 0, "c_winogrande_0shot_acc": 67.56 }, "harness-cmmlu": { "acc_norm": 59.87, "acc_stderr": 0, "cmmlu-anatomy_5shot_acc": 48.15, "cmmlu_fullavg_5shot_acc": 59.87, "cmmlu-virology_5shot_acc": 46.99, "cmmlu-astronomy_5shot_acc": 61.84, "cmmlu-marketing_5shot_acc": 80.34, "cmmlu-nutrition_5shot_acc": 65.36, "cmmlu-sociology_5shot_acc": 83.08, "cmmlu-management_5shot_acc": 67.96, "cmmlu-philosophy_5shot_acc": 67.2, "cmmlu-prehistory_5shot_acc": 65.43, "cmmlu-human_aging_5shot_acc": 64.57, "cmmlu-econometrics_5shot_acc": 39.47, "cmmlu-formal_logic_5shot_acc": 36.8, "cmmlu-global_facts_5shot_acc": 44, "cmmlu-jurisprudence_5shot_acc": 75.93, "cmmlu-miscellaneous_5shot_acc": 67.82, "cmmlu-moral_disputes_5shot_acc": 66.18, "cmmlu-business_ethics_5shot_acc": 68, "cmmlu-college_biology_5shot_acc": 53.47, "cmmlu-college_physics_5shot_acc": 35.29, "cmmlu-human_sexuality_5shot_acc": 64.89, "cmmlu-moral_scenarios_5shot_acc": 46.37, "cmmlu-world_religions_5shot_acc": 69.01, "cmmlu-abstract_algebra_5shot_acc": 41, "cmmlu-college_medicine_5shot_acc": 53.76, "cmmlu-machine_learning_5shot_acc": 
40.18, "cmmlu-medical_genetics_5shot_acc": 56, "cmmlu-professional_law_5shot_acc": 48.83, "cmmlu-public_relations_5shot_acc": 63.64, "cmmlu-security_studies_5shot_acc": 69.39, "cmmlu-college_chemistry_5shot_acc": 42, "cmmlu-computer_security_5shot_acc": 64, "cmmlu-international_law_5shot_acc": 67.77, "cmmlu-logical_fallacies_5shot_acc": 67.48, "cmmlu-us_foreign_policy_5shot_acc": 78, "cmmlu-clinical_knowledge_5shot_acc": 56.23, "cmmlu-conceptual_physics_5shot_acc": 49.36, "cmmlu-college_mathematics_5shot_acc": 38, "cmmlu-high_school_biology_5shot_acc": 68.06, "cmmlu-high_school_physics_5shot_acc": 39.07, "cmmlu-high_school_chemistry_5shot_acc": 51.23, "cmmlu-high_school_geography_5shot_acc": 80.3, "cmmlu-professional_medicine_5shot_acc": 60.66, "cmmlu-electrical_engineering_5shot_acc": 57.93, "cmmlu-elementary_mathematics_5shot_acc": 38.62, "cmmlu-high_school_psychology_5shot_acc": 74.86, "cmmlu-high_school_statistics_5shot_acc": 52.78, "cmmlu-high_school_us_history_5shot_acc": 81.37, "cmmlu-high_school_mathematics_5shot_acc": 35.56, "cmmlu-professional_accounting_5shot_acc": 47.87, "cmmlu-professional_psychology_5shot_acc": 62.58, "cmmlu-college_computer_science_5shot_acc": 49, "cmmlu-high_school_world_history_5shot_acc": 83.54, "cmmlu-high_school_macroeconomics_5shot_acc": 63.59, "cmmlu-high_school_microeconomics_5shot_acc": 69.75, "cmmlu-high_school_computer_science_5shot_acc": 74, "cmmlu-high_school_european_history_5shot_acc": 83.64, "cmmlu-high_school_government_and_politics_5shot_acc": 84.46, "cmmlu-anatomy_5shot_acc_norm": 48.15, "cmmlu_fullavg_5shot_acc_norm": 59.87, "cmmlu-virology_5shot_acc_norm": 46.99, "cmmlu-astronomy_5shot_acc_norm": 61.84, "cmmlu-marketing_5shot_acc_norm": 80.34, "cmmlu-nutrition_5shot_acc_norm": 65.36, "cmmlu-sociology_5shot_acc_norm": 83.08, "cmmlu-management_5shot_acc_norm": 67.96, "cmmlu-philosophy_5shot_acc_norm": 67.2, "cmmlu-prehistory_5shot_acc_norm": 65.43, "cmmlu-human_aging_5shot_acc_norm": 64.57, 
"cmmlu-econometrics_5shot_acc_norm": 39.47, "cmmlu-formal_logic_5shot_acc_norm": 36.8, "cmmlu-global_facts_5shot_acc_norm": 44, "cmmlu-jurisprudence_5shot_acc_norm": 75.93, "cmmlu-miscellaneous_5shot_acc_norm": 67.82, "cmmlu-moral_disputes_5shot_acc_norm": 66.18, "cmmlu-business_ethics_5shot_acc_norm": 68, "cmmlu-college_biology_5shot_acc_norm": 53.47, "cmmlu-college_physics_5shot_acc_norm": 35.29, "cmmlu-human_sexuality_5shot_acc_norm": 64.89, "cmmlu-moral_scenarios_5shot_acc_norm": 46.37, "cmmlu-world_religions_5shot_acc_norm": 69.01, "cmmlu-abstract_algebra_5shot_acc_norm": 41, "cmmlu-college_medicine_5shot_acc_norm": 53.76, "cmmlu-machine_learning_5shot_acc_norm": 40.18, "cmmlu-medical_genetics_5shot_acc_norm": 56, "cmmlu-professional_law_5shot_acc_norm": 48.83, "cmmlu-public_relations_5shot_acc_norm": 63.64, "cmmlu-security_studies_5shot_acc_norm": 69.39, "cmmlu-college_chemistry_5shot_acc_norm": 42, "cmmlu-computer_security_5shot_acc_norm": 64, "cmmlu-international_law_5shot_acc_norm": 67.77, "cmmlu-logical_fallacies_5shot_acc_norm": 67.48, "cmmlu-us_foreign_policy_5shot_acc_norm": 78, "cmmlu-clinical_knowledge_5shot_acc_norm": 56.23, "cmmlu-conceptual_physics_5shot_acc_norm": 49.36, "cmmlu-college_mathematics_5shot_acc_norm": 38, "cmmlu-high_school_biology_5shot_acc_norm": 68.06, "cmmlu-high_school_physics_5shot_acc_norm": 39.07, "cmmlu-high_school_chemistry_5shot_acc_norm": 51.23, "cmmlu-high_school_geography_5shot_acc_norm": 80.3, "cmmlu-professional_medicine_5shot_acc_norm": 60.66, "cmmlu-electrical_engineering_5shot_acc_norm": 57.93, "cmmlu-elementary_mathematics_5shot_acc_norm": 38.62, "cmmlu-high_school_psychology_5shot_acc_norm": 74.86, "cmmlu-high_school_statistics_5shot_acc_norm": 52.78, "cmmlu-high_school_us_history_5shot_acc_norm": 81.37, "cmmlu-high_school_mathematics_5shot_acc_norm": 35.56, "cmmlu-professional_accounting_5shot_acc_norm": 47.87, "cmmlu-professional_psychology_5shot_acc_norm": 62.58, 
"cmmlu-college_computer_science_5shot_acc_norm": 49, "cmmlu-high_school_world_history_5shot_acc_norm": 83.54, "cmmlu-high_school_macroeconomics_5shot_acc_norm": 63.59, "cmmlu-high_school_microeconomics_5shot_acc_norm": 69.75, "cmmlu-high_school_computer_science_5shot_acc_norm": 74, "cmmlu-high_school_european_history_5shot_acc_norm": 83.64, "cmmlu-high_school_government_and_politics_5shot_acc_norm": 84.46 } }
{}
{}
{}
{}
{ "model_name": "CombinHorizon/YiSM-blossom5.1-34B-SLERP", "model_dtype": "float16", "model_size": 0 }
{ "harness-c_arc_challenge": { "acc_norm": 62.37, "acc_stderr": 0, "c_arc_challenge_25shot_acc": 55.8, "c_arc_challenge_25shot_acc_norm": 62.37 }, "harness-c_gsm8k": { "acc": 70.13, "acc_stderr": 0, "c_gsm8k_5shot_acc": 70.13 }, "harness-c_hellaswag": { "acc_norm": 69.32, "acc_stderr": 0, "c_hellaswag_10shot_acc": 50.9, "c_hellaswag_10shot_acc_norm": 69.32 }, "harness-c-sem-v2": { "acc": 90.635, "acc_stderr": 0, "c_sem_v2-LLSRC_5shot_acc": 93.96, "c_sem_v2-SLPWC_5shot_acc": 86.86, "c_sem_v2-SLRFC_5shot_acc": 94.53, "c_sem_v2-SLSRC_5shot_acc": 87.19, "c_sem_v2-LLSRC_5shot_acc_norm": 93.96, "c_sem_v2-SLPWC_5shot_acc_norm": 86.86, "c_sem_v2-SLRFC_5shot_acc_norm": 94.53, "c_sem_v2-SLSRC_5shot_acc_norm": 87.19 }, "harness-c_truthfulqa_mc": { "mc2": 56.36, "acc_stderr": 0, "c_truthfulqa_mc_0shot_mc1": 35.37, "c_truthfulqa_mc_0shot_mc2": 56.36 }, "harness-c_winogrande": { "acc": 70.88, "acc_stderr": 0, "c_winogrande_0shot_acc": 70.88 }, "CLCC-H": { "acc": 0.7643, "acc_stderr": 0 }, "harness-cmmlu": { "acc_norm": 70.98, "acc_stderr": 0, "cmmlu-anatomy_5shot_acc": 61.48, "cmmlu_fullavg_5shot_acc": 70.98, "cmmlu-virology_5shot_acc": 54.82, "cmmlu-astronomy_5shot_acc": 81.58, "cmmlu-marketing_5shot_acc": 88.89, "cmmlu-nutrition_5shot_acc": 79.74, "cmmlu-sociology_5shot_acc": 82.09, "cmmlu-management_5shot_acc": 83.5, "cmmlu-philosophy_5shot_acc": 70.42, "cmmlu-prehistory_5shot_acc": 74.69, "cmmlu-human_aging_5shot_acc": 72.65, "cmmlu-econometrics_5shot_acc": 59.65, "cmmlu-formal_logic_5shot_acc": 53.6, "cmmlu-global_facts_5shot_acc": 49, "cmmlu-jurisprudence_5shot_acc": 81.48, "cmmlu-miscellaneous_5shot_acc": 80.72, "cmmlu-moral_disputes_5shot_acc": 73.12, "cmmlu-business_ethics_5shot_acc": 72, "cmmlu-college_biology_5shot_acc": 71.53, "cmmlu-college_physics_5shot_acc": 50.98, "cmmlu-human_sexuality_5shot_acc": 77.1, "cmmlu-moral_scenarios_5shot_acc": 57.65, "cmmlu-world_religions_5shot_acc": 74.27, "cmmlu-abstract_algebra_5shot_acc": 37, "cmmlu-college_medicine_5shot_acc": 
75.14, "cmmlu-machine_learning_5shot_acc": 62.5, "cmmlu-medical_genetics_5shot_acc": 72, "cmmlu-professional_law_5shot_acc": 52.8, "cmmlu-public_relations_5shot_acc": 65.45, "cmmlu-security_studies_5shot_acc": 79.18, "cmmlu-college_chemistry_5shot_acc": 53, "cmmlu-computer_security_5shot_acc": 76, "cmmlu-international_law_5shot_acc": 90.08, "cmmlu-logical_fallacies_5shot_acc": 73.62, "cmmlu-us_foreign_policy_5shot_acc": 86, "cmmlu-clinical_knowledge_5shot_acc": 76.98, "cmmlu-conceptual_physics_5shot_acc": 74.47, "cmmlu-college_mathematics_5shot_acc": 41, "cmmlu-high_school_biology_5shot_acc": 83.87, "cmmlu-high_school_physics_5shot_acc": 52.98, "cmmlu-high_school_chemistry_5shot_acc": 59.11, "cmmlu-high_school_geography_5shot_acc": 84.85, "cmmlu-professional_medicine_5shot_acc": 73.9, "cmmlu-electrical_engineering_5shot_acc": 69.66, "cmmlu-elementary_mathematics_5shot_acc": 68.25, "cmmlu-high_school_psychology_5shot_acc": 84.22, "cmmlu-high_school_statistics_5shot_acc": 67.59, "cmmlu-high_school_us_history_5shot_acc": 89.71, "cmmlu-high_school_mathematics_5shot_acc": 52.59, "cmmlu-professional_accounting_5shot_acc": 55.67, "cmmlu-professional_psychology_5shot_acc": 70.26, "cmmlu-college_computer_science_5shot_acc": 63, "cmmlu-high_school_world_history_5shot_acc": 86.92, "cmmlu-high_school_macroeconomics_5shot_acc": 77.95, "cmmlu-high_school_microeconomics_5shot_acc": 84.87, "cmmlu-high_school_computer_science_5shot_acc": 82, "cmmlu-high_school_european_history_5shot_acc": 80, "cmmlu-high_school_government_and_politics_5shot_acc": 92.23, "cmmlu-anatomy_5shot_acc_norm": 61.48, "cmmlu_fullavg_5shot_acc_norm": 70.98, "cmmlu-virology_5shot_acc_norm": 54.82, "cmmlu-astronomy_5shot_acc_norm": 81.58, "cmmlu-marketing_5shot_acc_norm": 88.89, "cmmlu-nutrition_5shot_acc_norm": 79.74, "cmmlu-sociology_5shot_acc_norm": 82.09, "cmmlu-management_5shot_acc_norm": 83.5, "cmmlu-philosophy_5shot_acc_norm": 70.42, "cmmlu-prehistory_5shot_acc_norm": 74.69, 
"cmmlu-human_aging_5shot_acc_norm": 72.65, "cmmlu-econometrics_5shot_acc_norm": 59.65, "cmmlu-formal_logic_5shot_acc_norm": 53.6, "cmmlu-global_facts_5shot_acc_norm": 49, "cmmlu-jurisprudence_5shot_acc_norm": 81.48, "cmmlu-miscellaneous_5shot_acc_norm": 80.72, "cmmlu-moral_disputes_5shot_acc_norm": 73.12, "cmmlu-business_ethics_5shot_acc_norm": 72, "cmmlu-college_biology_5shot_acc_norm": 71.53, "cmmlu-college_physics_5shot_acc_norm": 50.98, "cmmlu-human_sexuality_5shot_acc_norm": 77.1, "cmmlu-moral_scenarios_5shot_acc_norm": 57.65, "cmmlu-world_religions_5shot_acc_norm": 74.27, "cmmlu-abstract_algebra_5shot_acc_norm": 37, "cmmlu-college_medicine_5shot_acc_norm": 75.14, "cmmlu-machine_learning_5shot_acc_norm": 62.5, "cmmlu-medical_genetics_5shot_acc_norm": 72, "cmmlu-professional_law_5shot_acc_norm": 52.8, "cmmlu-public_relations_5shot_acc_norm": 65.45, "cmmlu-security_studies_5shot_acc_norm": 79.18, "cmmlu-college_chemistry_5shot_acc_norm": 53, "cmmlu-computer_security_5shot_acc_norm": 76, "cmmlu-international_law_5shot_acc_norm": 90.08, "cmmlu-logical_fallacies_5shot_acc_norm": 73.62, "cmmlu-us_foreign_policy_5shot_acc_norm": 86, "cmmlu-clinical_knowledge_5shot_acc_norm": 76.98, "cmmlu-conceptual_physics_5shot_acc_norm": 74.47, "cmmlu-college_mathematics_5shot_acc_norm": 41, "cmmlu-high_school_biology_5shot_acc_norm": 83.87, "cmmlu-high_school_physics_5shot_acc_norm": 52.98, "cmmlu-high_school_chemistry_5shot_acc_norm": 59.11, "cmmlu-high_school_geography_5shot_acc_norm": 84.85, "cmmlu-professional_medicine_5shot_acc_norm": 73.9, "cmmlu-electrical_engineering_5shot_acc_norm": 69.66, "cmmlu-elementary_mathematics_5shot_acc_norm": 68.25, "cmmlu-high_school_psychology_5shot_acc_norm": 84.22, "cmmlu-high_school_statistics_5shot_acc_norm": 67.59, "cmmlu-high_school_us_history_5shot_acc_norm": 89.71, "cmmlu-high_school_mathematics_5shot_acc_norm": 52.59, "cmmlu-professional_accounting_5shot_acc_norm": 55.67, "cmmlu-professional_psychology_5shot_acc_norm": 70.26, 
"cmmlu-college_computer_science_5shot_acc_norm": 63, "cmmlu-high_school_world_history_5shot_acc_norm": 86.92, "cmmlu-high_school_macroeconomics_5shot_acc_norm": 77.95, "cmmlu-high_school_microeconomics_5shot_acc_norm": 84.87, "cmmlu-high_school_computer_science_5shot_acc_norm": 82, "cmmlu-high_school_european_history_5shot_acc_norm": 80, "cmmlu-high_school_government_and_politics_5shot_acc_norm": 92.23 } }
{}
{}
{}
{}
{ "model_name": "ConvexAI/Luminex-34B-v0.1", "model_dtype": "float16", "model_size": 0 }
{ "harness-c_arc_challenge": { "acc_norm": 65.78, "acc_stderr": 0, "c_arc_challenge_25shot_acc": 61.77, "c_arc_challenge_25shot_acc_norm": 65.78 }, "harness-c_gsm8k": { "acc": 58.76, "acc_stderr": 0, "c_gsm8k_5shot_acc": 58.76 }, "harness-c_hellaswag": { "acc_norm": 70.93, "acc_stderr": 0, "c_hellaswag_10shot_acc": 51.48, "c_hellaswag_10shot_acc_norm": 70.93 }, "harness-c-sem-v2": { "acc": 89.85, "acc_stderr": 0, "c_sem_v2-LLSRC_5shot_acc": 93.67, "c_sem_v2-SLPWC_5shot_acc": 83.57, "c_sem_v2-SLRFC_5shot_acc": 93.53, "c_sem_v2-SLSRC_5shot_acc": 88.63, "c_sem_v2-LLSRC_5shot_acc_norm": 93.67, "c_sem_v2-SLPWC_5shot_acc_norm": 83.57, "c_sem_v2-SLRFC_5shot_acc_norm": 93.53, "c_sem_v2-SLSRC_5shot_acc_norm": 88.63 }, "harness-c_truthfulqa_mc": { "mc2": 58.34, "acc_stderr": 0, "c_truthfulqa_mc_0shot_mc1": 38.92, "c_truthfulqa_mc_0shot_mc2": 58.34 }, "harness-c_winogrande": { "acc": 71.9, "acc_stderr": 0, "c_winogrande_0shot_acc": 71.9 }, "CLCC-H": { "acc": 0.7707, "acc_stderr": 0 }, "harness-cmmlu": { "acc_norm": 69.69, "acc_stderr": 0, "cmmlu-anatomy_5shot_acc": 61.48, "cmmlu_fullavg_5shot_acc": 69.69, "cmmlu-virology_5shot_acc": 54.22, "cmmlu-astronomy_5shot_acc": 82.24, "cmmlu-marketing_5shot_acc": 87.61, "cmmlu-nutrition_5shot_acc": 81.37, "cmmlu-sociology_5shot_acc": 84.58, "cmmlu-management_5shot_acc": 78.64, "cmmlu-philosophy_5shot_acc": 74.92, "cmmlu-prehistory_5shot_acc": 75.31, "cmmlu-human_aging_5shot_acc": 73.09, "cmmlu-econometrics_5shot_acc": 54.39, "cmmlu-formal_logic_5shot_acc": 41.6, "cmmlu-global_facts_5shot_acc": 53, "cmmlu-jurisprudence_5shot_acc": 76.85, "cmmlu-miscellaneous_5shot_acc": 81.23, "cmmlu-moral_disputes_5shot_acc": 74.86, "cmmlu-business_ethics_5shot_acc": 68, "cmmlu-college_biology_5shot_acc": 75, "cmmlu-college_physics_5shot_acc": 47.06, "cmmlu-human_sexuality_5shot_acc": 77.86, "cmmlu-moral_scenarios_5shot_acc": 63.13, "cmmlu-world_religions_5shot_acc": 74.85, "cmmlu-abstract_algebra_5shot_acc": 38, "cmmlu-college_medicine_5shot_acc": 
70.52, "cmmlu-machine_learning_5shot_acc": 49.11, "cmmlu-medical_genetics_5shot_acc": 78, "cmmlu-professional_law_5shot_acc": 51.5, "cmmlu-public_relations_5shot_acc": 64.55, "cmmlu-security_studies_5shot_acc": 79.59, "cmmlu-college_chemistry_5shot_acc": 52, "cmmlu-computer_security_5shot_acc": 76, "cmmlu-international_law_5shot_acc": 88.43, "cmmlu-logical_fallacies_5shot_acc": 69.94, "cmmlu-us_foreign_policy_5shot_acc": 86, "cmmlu-clinical_knowledge_5shot_acc": 73.21, "cmmlu-conceptual_physics_5shot_acc": 69.79, "cmmlu-college_mathematics_5shot_acc": 48, "cmmlu-high_school_biology_5shot_acc": 81.29, "cmmlu-high_school_physics_5shot_acc": 46.36, "cmmlu-high_school_chemistry_5shot_acc": 58.62, "cmmlu-high_school_geography_5shot_acc": 83.33, "cmmlu-professional_medicine_5shot_acc": 70.96, "cmmlu-electrical_engineering_5shot_acc": 65.52, "cmmlu-elementary_mathematics_5shot_acc": 69.05, "cmmlu-high_school_psychology_5shot_acc": 84.22, "cmmlu-high_school_statistics_5shot_acc": 63.89, "cmmlu-high_school_us_history_5shot_acc": 86.76, "cmmlu-high_school_mathematics_5shot_acc": 44.44, "cmmlu-professional_accounting_5shot_acc": 57.8, "cmmlu-professional_psychology_5shot_acc": 72.55, "cmmlu-college_computer_science_5shot_acc": 59, "cmmlu-high_school_world_history_5shot_acc": 86.5, "cmmlu-high_school_macroeconomics_5shot_acc": 78.46, "cmmlu-high_school_microeconomics_5shot_acc": 79.41, "cmmlu-high_school_computer_science_5shot_acc": 79, "cmmlu-high_school_european_history_5shot_acc": 78.79, "cmmlu-high_school_government_and_politics_5shot_acc": 90.67, "cmmlu-anatomy_5shot_acc_norm": 61.48, "cmmlu_fullavg_5shot_acc_norm": 69.69, "cmmlu-virology_5shot_acc_norm": 54.22, "cmmlu-astronomy_5shot_acc_norm": 82.24, "cmmlu-marketing_5shot_acc_norm": 87.61, "cmmlu-nutrition_5shot_acc_norm": 81.37, "cmmlu-sociology_5shot_acc_norm": 84.58, "cmmlu-management_5shot_acc_norm": 78.64, "cmmlu-philosophy_5shot_acc_norm": 74.92, "cmmlu-prehistory_5shot_acc_norm": 75.31, 
"cmmlu-human_aging_5shot_acc_norm": 73.09, "cmmlu-econometrics_5shot_acc_norm": 54.39, "cmmlu-formal_logic_5shot_acc_norm": 41.6, "cmmlu-global_facts_5shot_acc_norm": 53, "cmmlu-jurisprudence_5shot_acc_norm": 76.85, "cmmlu-miscellaneous_5shot_acc_norm": 81.23, "cmmlu-moral_disputes_5shot_acc_norm": 74.86, "cmmlu-business_ethics_5shot_acc_norm": 68, "cmmlu-college_biology_5shot_acc_norm": 75, "cmmlu-college_physics_5shot_acc_norm": 47.06, "cmmlu-human_sexuality_5shot_acc_norm": 77.86, "cmmlu-moral_scenarios_5shot_acc_norm": 63.13, "cmmlu-world_religions_5shot_acc_norm": 74.85, "cmmlu-abstract_algebra_5shot_acc_norm": 38, "cmmlu-college_medicine_5shot_acc_norm": 70.52, "cmmlu-machine_learning_5shot_acc_norm": 49.11, "cmmlu-medical_genetics_5shot_acc_norm": 78, "cmmlu-professional_law_5shot_acc_norm": 51.5, "cmmlu-public_relations_5shot_acc_norm": 64.55, "cmmlu-security_studies_5shot_acc_norm": 79.59, "cmmlu-college_chemistry_5shot_acc_norm": 52, "cmmlu-computer_security_5shot_acc_norm": 76, "cmmlu-international_law_5shot_acc_norm": 88.43, "cmmlu-logical_fallacies_5shot_acc_norm": 69.94, "cmmlu-us_foreign_policy_5shot_acc_norm": 86, "cmmlu-clinical_knowledge_5shot_acc_norm": 73.21, "cmmlu-conceptual_physics_5shot_acc_norm": 69.79, "cmmlu-college_mathematics_5shot_acc_norm": 48, "cmmlu-high_school_biology_5shot_acc_norm": 81.29, "cmmlu-high_school_physics_5shot_acc_norm": 46.36, "cmmlu-high_school_chemistry_5shot_acc_norm": 58.62, "cmmlu-high_school_geography_5shot_acc_norm": 83.33, "cmmlu-professional_medicine_5shot_acc_norm": 70.96, "cmmlu-electrical_engineering_5shot_acc_norm": 65.52, "cmmlu-elementary_mathematics_5shot_acc_norm": 69.05, "cmmlu-high_school_psychology_5shot_acc_norm": 84.22, "cmmlu-high_school_statistics_5shot_acc_norm": 63.89, "cmmlu-high_school_us_history_5shot_acc_norm": 86.76, "cmmlu-high_school_mathematics_5shot_acc_norm": 44.44, "cmmlu-professional_accounting_5shot_acc_norm": 57.8, "cmmlu-professional_psychology_5shot_acc_norm": 72.55, 
"cmmlu-college_computer_science_5shot_acc_norm": 59, "cmmlu-high_school_world_history_5shot_acc_norm": 86.5, "cmmlu-high_school_macroeconomics_5shot_acc_norm": 78.46, "cmmlu-high_school_microeconomics_5shot_acc_norm": 79.41, "cmmlu-high_school_computer_science_5shot_acc_norm": 79, "cmmlu-high_school_european_history_5shot_acc_norm": 78.79, "cmmlu-high_school_government_and_politics_5shot_acc_norm": 90.67 } }
{}
{}
{}
{}
{ "model_name": "ConvexAI/Luminex-34B-v0.2", "model_dtype": "float16", "model_size": 0 }
{ "harness-c_arc_challenge": { "acc_norm": 65.87, "acc_stderr": 0, "c_arc_challenge_25shot_acc": 61.77, "c_arc_challenge_25shot_acc_norm": 65.87 }, "harness-c_gsm8k": { "acc": 56.41, "acc_stderr": 0, "c_gsm8k_5shot_acc": 56.41 }, "harness-c_hellaswag": { "acc_norm": 71.14, "acc_stderr": 0, "c_hellaswag_10shot_acc": 51.87, "c_hellaswag_10shot_acc_norm": 71.14 }, "harness-c-sem-v2": { "acc": 90.745, "acc_stderr": 0, "c_sem_v2-LLSRC_5shot_acc": 93.38, "c_sem_v2-SLPWC_5shot_acc": 85.86, "c_sem_v2-SLRFC_5shot_acc": 95.11, "c_sem_v2-SLSRC_5shot_acc": 88.63, "c_sem_v2-LLSRC_5shot_acc_norm": 93.38, "c_sem_v2-SLPWC_5shot_acc_norm": 85.86, "c_sem_v2-SLRFC_5shot_acc_norm": 95.11, "c_sem_v2-SLSRC_5shot_acc_norm": 88.63 }, "harness-c_truthfulqa_mc": { "mc2": 59.38, "acc_stderr": 0, "c_truthfulqa_mc_0shot_mc1": 39.78, "c_truthfulqa_mc_0shot_mc2": 59.38 }, "harness-c_winogrande": { "acc": 71.9, "acc_stderr": 0, "c_winogrande_0shot_acc": 71.9 }, "CLCC-H": { "acc": 0.7803, "acc_stderr": 0 }, "harness-cmmlu": { "acc_norm": 70.01, "acc_stderr": 0, "cmmlu-anatomy_5shot_acc": 58.52, "cmmlu_fullavg_5shot_acc": 70.01, "cmmlu-virology_5shot_acc": 54.82, "cmmlu-astronomy_5shot_acc": 82.89, "cmmlu-marketing_5shot_acc": 85.9, "cmmlu-nutrition_5shot_acc": 80.72, "cmmlu-sociology_5shot_acc": 85.07, "cmmlu-management_5shot_acc": 78.64, "cmmlu-philosophy_5shot_acc": 72.35, "cmmlu-prehistory_5shot_acc": 76.54, "cmmlu-human_aging_5shot_acc": 75.34, "cmmlu-econometrics_5shot_acc": 53.51, "cmmlu-formal_logic_5shot_acc": 43.2, "cmmlu-global_facts_5shot_acc": 53, "cmmlu-jurisprudence_5shot_acc": 79.63, "cmmlu-miscellaneous_5shot_acc": 81.23, "cmmlu-moral_disputes_5shot_acc": 75.43, "cmmlu-business_ethics_5shot_acc": 74, "cmmlu-college_biology_5shot_acc": 75.69, "cmmlu-college_physics_5shot_acc": 49.02, "cmmlu-human_sexuality_5shot_acc": 78.63, "cmmlu-moral_scenarios_5shot_acc": 67.15, "cmmlu-world_religions_5shot_acc": 74.85, "cmmlu-abstract_algebra_5shot_acc": 38, "cmmlu-college_medicine_5shot_acc": 
71.1, "cmmlu-machine_learning_5shot_acc": 49.11, "cmmlu-medical_genetics_5shot_acc": 76, "cmmlu-professional_law_5shot_acc": 50.91, "cmmlu-public_relations_5shot_acc": 67.27, "cmmlu-security_studies_5shot_acc": 79.18, "cmmlu-college_chemistry_5shot_acc": 51, "cmmlu-computer_security_5shot_acc": 73, "cmmlu-international_law_5shot_acc": 86.78, "cmmlu-logical_fallacies_5shot_acc": 68.1, "cmmlu-us_foreign_policy_5shot_acc": 89, "cmmlu-clinical_knowledge_5shot_acc": 76.23, "cmmlu-conceptual_physics_5shot_acc": 71.06, "cmmlu-college_mathematics_5shot_acc": 45, "cmmlu-high_school_biology_5shot_acc": 81.61, "cmmlu-high_school_physics_5shot_acc": 47.02, "cmmlu-high_school_chemistry_5shot_acc": 60.1, "cmmlu-high_school_geography_5shot_acc": 82.32, "cmmlu-professional_medicine_5shot_acc": 72.79, "cmmlu-electrical_engineering_5shot_acc": 65.52, "cmmlu-elementary_mathematics_5shot_acc": 69.05, "cmmlu-high_school_psychology_5shot_acc": 83.67, "cmmlu-high_school_statistics_5shot_acc": 62.5, "cmmlu-high_school_us_history_5shot_acc": 86.76, "cmmlu-high_school_mathematics_5shot_acc": 44.81, "cmmlu-professional_accounting_5shot_acc": 57.09, "cmmlu-professional_psychology_5shot_acc": 71.41, "cmmlu-college_computer_science_5shot_acc": 61, "cmmlu-high_school_world_history_5shot_acc": 86.92, "cmmlu-high_school_macroeconomics_5shot_acc": 80.51, "cmmlu-high_school_microeconomics_5shot_acc": 81.09, "cmmlu-high_school_computer_science_5shot_acc": 80, "cmmlu-high_school_european_history_5shot_acc": 78.79, "cmmlu-high_school_government_and_politics_5shot_acc": 89.64, "cmmlu-anatomy_5shot_acc_norm": 58.52, "cmmlu_fullavg_5shot_acc_norm": 70.01, "cmmlu-virology_5shot_acc_norm": 54.82, "cmmlu-astronomy_5shot_acc_norm": 82.89, "cmmlu-marketing_5shot_acc_norm": 85.9, "cmmlu-nutrition_5shot_acc_norm": 80.72, "cmmlu-sociology_5shot_acc_norm": 85.07, "cmmlu-management_5shot_acc_norm": 78.64, "cmmlu-philosophy_5shot_acc_norm": 72.35, "cmmlu-prehistory_5shot_acc_norm": 76.54, 
"cmmlu-human_aging_5shot_acc_norm": 75.34, "cmmlu-econometrics_5shot_acc_norm": 53.51, "cmmlu-formal_logic_5shot_acc_norm": 43.2, "cmmlu-global_facts_5shot_acc_norm": 53, "cmmlu-jurisprudence_5shot_acc_norm": 79.63, "cmmlu-miscellaneous_5shot_acc_norm": 81.23, "cmmlu-moral_disputes_5shot_acc_norm": 75.43, "cmmlu-business_ethics_5shot_acc_norm": 74, "cmmlu-college_biology_5shot_acc_norm": 75.69, "cmmlu-college_physics_5shot_acc_norm": 49.02, "cmmlu-human_sexuality_5shot_acc_norm": 78.63, "cmmlu-moral_scenarios_5shot_acc_norm": 67.15, "cmmlu-world_religions_5shot_acc_norm": 74.85, "cmmlu-abstract_algebra_5shot_acc_norm": 38, "cmmlu-college_medicine_5shot_acc_norm": 71.1, "cmmlu-machine_learning_5shot_acc_norm": 49.11, "cmmlu-medical_genetics_5shot_acc_norm": 76, "cmmlu-professional_law_5shot_acc_norm": 50.91, "cmmlu-public_relations_5shot_acc_norm": 67.27, "cmmlu-security_studies_5shot_acc_norm": 79.18, "cmmlu-college_chemistry_5shot_acc_norm": 51, "cmmlu-computer_security_5shot_acc_norm": 73, "cmmlu-international_law_5shot_acc_norm": 86.78, "cmmlu-logical_fallacies_5shot_acc_norm": 68.1, "cmmlu-us_foreign_policy_5shot_acc_norm": 89, "cmmlu-clinical_knowledge_5shot_acc_norm": 76.23, "cmmlu-conceptual_physics_5shot_acc_norm": 71.06, "cmmlu-college_mathematics_5shot_acc_norm": 45, "cmmlu-high_school_biology_5shot_acc_norm": 81.61, "cmmlu-high_school_physics_5shot_acc_norm": 47.02, "cmmlu-high_school_chemistry_5shot_acc_norm": 60.1, "cmmlu-high_school_geography_5shot_acc_norm": 82.32, "cmmlu-professional_medicine_5shot_acc_norm": 72.79, "cmmlu-electrical_engineering_5shot_acc_norm": 65.52, "cmmlu-elementary_mathematics_5shot_acc_norm": 69.05, "cmmlu-high_school_psychology_5shot_acc_norm": 83.67, "cmmlu-high_school_statistics_5shot_acc_norm": 62.5, "cmmlu-high_school_us_history_5shot_acc_norm": 86.76, "cmmlu-high_school_mathematics_5shot_acc_norm": 44.81, "cmmlu-professional_accounting_5shot_acc_norm": 57.09, "cmmlu-professional_psychology_5shot_acc_norm": 71.41, 
"cmmlu-college_computer_science_5shot_acc_norm": 61, "cmmlu-high_school_world_history_5shot_acc_norm": 86.92, "cmmlu-high_school_macroeconomics_5shot_acc_norm": 80.51, "cmmlu-high_school_microeconomics_5shot_acc_norm": 81.09, "cmmlu-high_school_computer_science_5shot_acc_norm": 80, "cmmlu-high_school_european_history_5shot_acc_norm": 78.79, "cmmlu-high_school_government_and_politics_5shot_acc_norm": 89.64 } }
{}
{}
{}
{}
{ "model_name": "CultriX/NeuralMona_MoE-4x7B", "model_dtype": "float16", "model_size": 0 }
{ "harness-c_arc_challenge": { "acc_norm": 52.39, "acc_stderr": 0, "c_arc_challenge_25shot_acc": 48.38, "c_arc_challenge_25shot_acc_norm": 52.39 }, "harness-c_gsm8k": { "acc": 47.38, "acc_stderr": 0, "c_gsm8k_5shot_acc": 47.38 }, "harness-c_hellaswag": { "acc_norm": 60.54, "acc_stderr": 0, "c_hellaswag_10shot_acc": 45.01, "c_hellaswag_10shot_acc_norm": 60.54 }, "harness-c-sem-v2": { "acc": 61.2525, "acc_stderr": 0, "c_sem_v2-LLSRC_5shot_acc": 62.59, "c_sem_v2-SLPWC_5shot_acc": 63.86, "c_sem_v2-SLRFC_5shot_acc": 45.32, "c_sem_v2-SLSRC_5shot_acc": 73.24, "c_sem_v2-LLSRC_5shot_acc_norm": 62.59, "c_sem_v2-SLPWC_5shot_acc_norm": 63.86, "c_sem_v2-SLRFC_5shot_acc_norm": 45.32, "c_sem_v2-SLSRC_5shot_acc_norm": 73.24 }, "harness-c_truthfulqa_mc": { "mc2": 63.11, "acc_stderr": 0, "c_truthfulqa_mc_0shot_mc1": 43.57, "c_truthfulqa_mc_0shot_mc2": 63.11 }, "harness-c_winogrande": { "acc": 62.83, "acc_stderr": 0, "c_winogrande_0shot_acc": 62.83 }, "CLCC-H": { "acc": 0, "acc_stderr": 0 }, "harness-cmmlu": { "acc_norm": 46.95, "acc_stderr": 0, "cmmlu-anatomy_5shot_acc": 39.26, "cmmlu_fullavg_5shot_acc": 46.95, "cmmlu-virology_5shot_acc": 36.75, "cmmlu-astronomy_5shot_acc": 48.68, "cmmlu-marketing_5shot_acc": 72.65, "cmmlu-nutrition_5shot_acc": 50, "cmmlu-sociology_5shot_acc": 62.69, "cmmlu-management_5shot_acc": 61.17, "cmmlu-philosophy_5shot_acc": 43.41, "cmmlu-prehistory_5shot_acc": 43.21, "cmmlu-human_aging_5shot_acc": 50.22, "cmmlu-econometrics_5shot_acc": 36.84, "cmmlu-formal_logic_5shot_acc": 40.8, "cmmlu-global_facts_5shot_acc": 36, "cmmlu-jurisprudence_5shot_acc": 55.56, "cmmlu-miscellaneous_5shot_acc": 51.47, "cmmlu-moral_disputes_5shot_acc": 49.42, "cmmlu-business_ethics_5shot_acc": 51, "cmmlu-college_biology_5shot_acc": 39.58, "cmmlu-college_physics_5shot_acc": 35.29, "cmmlu-human_sexuality_5shot_acc": 44.27, "cmmlu-moral_scenarios_5shot_acc": 26.82, "cmmlu-world_religions_5shot_acc": 46.78, "cmmlu-abstract_algebra_5shot_acc": 30, "cmmlu-college_medicine_5shot_acc": 
40.46, "cmmlu-machine_learning_5shot_acc": 43.75, "cmmlu-medical_genetics_5shot_acc": 43, "cmmlu-professional_law_5shot_acc": 35.66, "cmmlu-public_relations_5shot_acc": 52.73, "cmmlu-security_studies_5shot_acc": 60, "cmmlu-college_chemistry_5shot_acc": 39, "cmmlu-computer_security_5shot_acc": 63, "cmmlu-international_law_5shot_acc": 64.46, "cmmlu-logical_fallacies_5shot_acc": 55.21, "cmmlu-us_foreign_policy_5shot_acc": 49, "cmmlu-clinical_knowledge_5shot_acc": 47.55, "cmmlu-conceptual_physics_5shot_acc": 45.11, "cmmlu-college_mathematics_5shot_acc": 35, "cmmlu-high_school_biology_5shot_acc": 47.42, "cmmlu-high_school_physics_5shot_acc": 35.1, "cmmlu-high_school_chemistry_5shot_acc": 35.47, "cmmlu-high_school_geography_5shot_acc": 56.06, "cmmlu-professional_medicine_5shot_acc": 36.03, "cmmlu-electrical_engineering_5shot_acc": 45.52, "cmmlu-elementary_mathematics_5shot_acc": 37.3, "cmmlu-high_school_psychology_5shot_acc": 56.51, "cmmlu-high_school_statistics_5shot_acc": 38.89, "cmmlu-high_school_us_history_5shot_acc": 55.88, "cmmlu-high_school_mathematics_5shot_acc": 27.78, "cmmlu-professional_accounting_5shot_acc": 37.59, "cmmlu-professional_psychology_5shot_acc": 45.1, "cmmlu-college_computer_science_5shot_acc": 47, "cmmlu-high_school_world_history_5shot_acc": 64.98, "cmmlu-high_school_macroeconomics_5shot_acc": 50.77, "cmmlu-high_school_microeconomics_5shot_acc": 51.26, "cmmlu-high_school_computer_science_5shot_acc": 63, "cmmlu-high_school_european_history_5shot_acc": 63.03, "cmmlu-high_school_government_and_politics_5shot_acc": 55.44, "cmmlu-anatomy_5shot_acc_norm": 39.26, "cmmlu_fullavg_5shot_acc_norm": 46.95, "cmmlu-virology_5shot_acc_norm": 36.75, "cmmlu-astronomy_5shot_acc_norm": 48.68, "cmmlu-marketing_5shot_acc_norm": 72.65, "cmmlu-nutrition_5shot_acc_norm": 50, "cmmlu-sociology_5shot_acc_norm": 62.69, "cmmlu-management_5shot_acc_norm": 61.17, "cmmlu-philosophy_5shot_acc_norm": 43.41, "cmmlu-prehistory_5shot_acc_norm": 43.21, 
"cmmlu-human_aging_5shot_acc_norm": 50.22, "cmmlu-econometrics_5shot_acc_norm": 36.84, "cmmlu-formal_logic_5shot_acc_norm": 40.8, "cmmlu-global_facts_5shot_acc_norm": 36, "cmmlu-jurisprudence_5shot_acc_norm": 55.56, "cmmlu-miscellaneous_5shot_acc_norm": 51.47, "cmmlu-moral_disputes_5shot_acc_norm": 49.42, "cmmlu-business_ethics_5shot_acc_norm": 51, "cmmlu-college_biology_5shot_acc_norm": 39.58, "cmmlu-college_physics_5shot_acc_norm": 35.29, "cmmlu-human_sexuality_5shot_acc_norm": 44.27, "cmmlu-moral_scenarios_5shot_acc_norm": 26.82, "cmmlu-world_religions_5shot_acc_norm": 46.78, "cmmlu-abstract_algebra_5shot_acc_norm": 30, "cmmlu-college_medicine_5shot_acc_norm": 40.46, "cmmlu-machine_learning_5shot_acc_norm": 43.75, "cmmlu-medical_genetics_5shot_acc_norm": 43, "cmmlu-professional_law_5shot_acc_norm": 35.66, "cmmlu-public_relations_5shot_acc_norm": 52.73, "cmmlu-security_studies_5shot_acc_norm": 60, "cmmlu-college_chemistry_5shot_acc_norm": 39, "cmmlu-computer_security_5shot_acc_norm": 63, "cmmlu-international_law_5shot_acc_norm": 64.46, "cmmlu-logical_fallacies_5shot_acc_norm": 55.21, "cmmlu-us_foreign_policy_5shot_acc_norm": 49, "cmmlu-clinical_knowledge_5shot_acc_norm": 47.55, "cmmlu-conceptual_physics_5shot_acc_norm": 45.11, "cmmlu-college_mathematics_5shot_acc_norm": 35, "cmmlu-high_school_biology_5shot_acc_norm": 47.42, "cmmlu-high_school_physics_5shot_acc_norm": 35.1, "cmmlu-high_school_chemistry_5shot_acc_norm": 35.47, "cmmlu-high_school_geography_5shot_acc_norm": 56.06, "cmmlu-professional_medicine_5shot_acc_norm": 36.03, "cmmlu-electrical_engineering_5shot_acc_norm": 45.52, "cmmlu-elementary_mathematics_5shot_acc_norm": 37.3, "cmmlu-high_school_psychology_5shot_acc_norm": 56.51, "cmmlu-high_school_statistics_5shot_acc_norm": 38.89, "cmmlu-high_school_us_history_5shot_acc_norm": 55.88, "cmmlu-high_school_mathematics_5shot_acc_norm": 27.78, "cmmlu-professional_accounting_5shot_acc_norm": 37.59, "cmmlu-professional_psychology_5shot_acc_norm": 45.1, 
"cmmlu-college_computer_science_5shot_acc_norm": 47, "cmmlu-high_school_world_history_5shot_acc_norm": 64.98, "cmmlu-high_school_macroeconomics_5shot_acc_norm": 50.77, "cmmlu-high_school_microeconomics_5shot_acc_norm": 51.26, "cmmlu-high_school_computer_science_5shot_acc_norm": 63, "cmmlu-high_school_european_history_5shot_acc_norm": 63.03, "cmmlu-high_school_government_and_politics_5shot_acc_norm": 55.44 } }
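Each row above is a flat JSON record mapping a harness name to its metrics. As an illustration only (not part of the dataset), here is a minimal Python sketch of pulling each harness's headline metric out of such a row and averaging them. The six harness keys and their values are copied from the preview above; the `headline` helper and its metric-priority order are assumptions, not an API this dataset provides:

```python
import json

# A trimmed result row in the same shape as the preview above
# (only the top-level headline metrics, subscores omitted).
row = json.loads("""
{
  "harness-c_arc_challenge": {"acc_norm": 52.39, "acc_stderr": 0},
  "harness-c_gsm8k": {"acc": 47.38, "acc_stderr": 0},
  "harness-c_hellaswag": {"acc_norm": 60.54, "acc_stderr": 0},
  "harness-c_truthfulqa_mc": {"mc2": 63.11, "acc_stderr": 0},
  "harness-c_winogrande": {"acc": 62.83, "acc_stderr": 0},
  "harness-cmmlu": {"acc_norm": 46.95, "acc_stderr": 0}
}
""")

def headline(metrics: dict) -> float:
    # Each harness reports exactly one headline metric; the
    # priority order here is a guess based on the row's keys.
    for key in ("acc_norm", "mc2", "acc"):
        if key in metrics:
            return metrics[key]
    raise KeyError("no headline metric found")

scores = {name: headline(m) for name, m in row.items()}
average = sum(scores.values()) / len(scores)
print(f"{average:.2f}")  # mean of the six headline scores
```

The same pattern applies unchanged to the full row, since every `harness-*` entry carries one of these three metric keys at its top level.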
Downloads last month: 896