Add results/MC/Gemma2-9b-WangchanLIONv2-instruct/results.json to result queue
28fa662 verified
{"config": {"model_name": "aisingapore/Gemma2-9b-WangchanLIONv2-instruct"}, "results": {"m3exam_tha_seacrowd_qa": {"accuracy": 0.4501845018450184}, "thaiexam_qa": {"accuracy": 0.3858407079646018}}}