Chalermpun
Add results/MC/Meta-Llama-3.1-405B-Instruct-AWQ-INT4/results.json to result queue
c0681e8 verified
{
  "config": {
    "model_name": "hugging-quants/Meta-Llama-3.1-405B-Instruct-AWQ-INT4"
  },
  "results": {
    "thaiexam_qa": {
      "accuracy": 0.6353982300884956
    },
    "m3exam_tha_seacrowd_qa": {
      "accuracy": 0.6166974169741697
    }
  }
}
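A minimal sketch of reading this results payload with Python's standard `json` module; it parses the structure shown above (the `config.model_name` field and the per-task `accuracy` values) and prints a small summary. The inline `payload` string is used here only so the example is self-contained; in practice you would read the committed `results.json` from disk.

```python
import json

# The results payload as committed (inlined here so the sketch is self-contained;
# normally you would use json.load(open("results.json"))).
payload = (
    '{"config": {"model_name": '
    '"hugging-quants/Meta-Llama-3.1-405B-Instruct-AWQ-INT4"}, '
    '"results": {"thaiexam_qa": {"accuracy": 0.6353982300884956}, '
    '"m3exam_tha_seacrowd_qa": {"accuracy": 0.6166974169741697}}}'
)

data = json.loads(payload)
model = data["config"]["model_name"]

# Collect task -> accuracy pairs from the "results" section.
scores = {task: metrics["accuracy"] for task, metrics in data["results"].items()}

print(model)
for task, acc in sorted(scores.items()):
    print(f"{task}: {acc:.4f}")
```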