---
base_model:
- meta-llama/Llama-3.3-70B-Instruct
---
lm-evaluation-harness results, comparing the original BF16 model with this one.

BF16 baseline:
```
arc_challenge
{'alias': 'arc_challenge', 'acc,none': 0.5332764505119454, 'acc_stderr,none': 0.014578995859605814, 'acc_norm,none': 0.5324232081911263, 'acc_norm_stderr,none': 0.014580637569995418}
arc_easy
{'alias': 'arc_easy', 'acc,none': 0.7558922558922558, 'acc_stderr,none': 0.008814322157999389, 'acc_norm,none': 0.6435185185185185, 'acc_norm_stderr,none': 0.009828046544504433}
hellaswag
{'alias': 'hellaswag', 'acc,none': 0.5852419836685919, 'acc_stderr,none': 0.004916733258140271, 'acc_norm,none': 0.6487751443935471, 'acc_norm_stderr,none': 0.00476377498183474}
piqa
{'alias': 'piqa', 'acc,none': 0.8003264417845484, 'acc_stderr,none': 0.009326942154519176, 'acc_norm,none': 0.7899891186071817, 'acc_norm_stderr,none': 0.00950335330581858}
```
This model:
```
arc_challenge
{'alias': 'arc_challenge', 'acc,none': 0.5341296928327645, 'acc_stderr,none': 0.014577311315231102, 'acc_norm,none': 0.515358361774744, 'acc_norm_stderr,none': 0.014604496129394911}
arc_easy
{'alias': 'arc_easy', 'acc,none': 0.7470538720538721, 'acc_stderr,none': 0.008919862739165613, 'acc_norm,none': 0.6325757575757576, 'acc_norm_stderr,none': 0.009892552616211553}
hellaswag
{'alias': 'hellaswag', 'acc,none': 0.5776737701653057, 'acc_stderr,none': 0.004929204864315951, 'acc_norm,none': 0.6430989842660825, 'acc_norm_stderr,none': 0.004781061390873893}
piqa
{'alias': 'piqa', 'acc,none': 0.7986942328618063, 'acc_stderr,none': 0.00935543109899043, 'acc_norm,none': 0.8008705114254625, 'acc_norm_stderr,none': 0.009317391893706867}
```
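
Across all four tasks, this model stays within roughly two points of the BF16 baseline on every metric (and slightly ahead on piqa acc_norm). Below is a minimal sketch for reproducing the numbers, assuming they came from EleutherAI's lm-evaluation-harness, whose result dictionaries match the format above; the repo id for "this model" is not stated in the card, and neither are the harness version or batch size, so scores may vary slightly.

```
# Sketch: score a checkpoint on the four tasks above with lm-evaluation-harness
# (pip install lm_eval). Output lands in the same dict format shown above.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    # BF16 baseline from the card; swap in this repository's id (not stated
    # in the card) to evaluate this model itself.
    model_args="pretrained=meta-llama/Llama-3.3-70B-Instruct,dtype=bfloat16",
    tasks=["arc_challenge", "arc_easy", "hellaswag", "piqa"],
    batch_size=8,  # assumption: adjust to your hardware
)

for task, metrics in results["results"].items():
    print(task)
    print(metrics)
```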