---
license: llama2
datasets:
- gair-prox/open-web-math-pro
language:
- en
pipeline_tag: text-generation
tags:
- llama2
- math
- reasoning
base_model:
- meta-llama/Llama-2-7b-hf
---

# Llama-2-7B-ProXMath

<p align="center">
  <img src="prox-teaser.png">
</p>

[ArXiv](http://arxiv.org/abs/xxxx) | [Data: OpenWebMath-Pro](https://huggingface.co/datasets/gair-prox/open-web-math-pro) | [Code](https://github.com/GAIR-NLP/program-every-example)

**Llama-2-7B-ProXMath** is a math-adapted Llama-2-7B model, continually pre-trained for **10B** tokens on [OpenWebMath-Pro](https://huggingface.co/datasets/gair-prox/open-web-math-pro), a version of OpenWebMath refined with ProX.
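
A minimal inference sketch with 🤗 Transformers. The repo id `gair-prox/Llama-2-7B-ProXMath` is assumed from the model name and is not confirmed by this card; running the example downloads the full 7B weights, so a GPU is recommended:

```python
# Minimal generation sketch for Llama-2-7B-ProXMath.
# The repo id below is an assumption inferred from the model name.
MODEL_ID = "gair-prox/Llama-2-7B-ProXMath"


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Generate a completion for `prompt`. Downloads weights on first call."""
    # Imports are kept local so the sketch can be read without
    # torch/transformers installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # halve memory vs. fp32
        device_map="auto",           # place layers on available devices
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Question: What is 15 * 12?\nAnswer:"))
```

Since the model is a base (not chat-tuned) model, few-shot prompting generally works better than instruction-style prompts.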

## Evaluations

ProX models are evaluated on 9 common math reasoning benchmarks.

| Model               | asdiv | gsm8k | mathqa | mawps | minerva_math | mmlu_stem | sat_math | svamp | tabmwp | average |
|---------------------|:-----:|:-----:|:------:|:-----:|:------------:|:---------:|:--------:|:-----:|:------:|:-------:|
| Llama-2-7B          |  51.6 |  14.1 |  12.5  |  63.6 |      3.8     |    32.9   |   34.4   |  39.5 |  30.9  |   31.5  |
| Llama-2-7B-ProXMath |  63.7 |  30.6 |  40.1  |  79.3 |     16.8     |    43.8   |   53.1   |  50.2 |  37.3  |   46.1  |

## Citation
```
@misc{TBD
}
```