---
license: afl-3.0
---

# MWP-BERT

NAACL 2022 Findings paper: *MWP-BERT: Numeracy-Augmented Pre-training for Math Word Problem Solving*

[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/mwp-bert-a-strong-baseline-for-math-word/math-word-problem-solving-on-mathqa)](https://paperswithcode.com/sota/math-word-problem-solving-on-mathqa?p=mwp-bert-a-strong-baseline-for-math-word)
[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/mwp-bert-a-strong-baseline-for-math-word/math-word-problem-solving-on-math23k)](https://paperswithcode.com/sota/math-word-problem-solving-on-math23k?p=mwp-bert-a-strong-baseline-for-math-word)

GitHub: https://github.com/LZhenwen/MWP-BERT/

Please use the tokenizer of `hfl/chinese-bert-wwm-ext` with this model.

## Citation

```
@inproceedings{liang2022mwp,
  title={MWP-BERT: Numeracy-Augmented Pre-training for Math Word Problem Solving},
  author={Liang, Zhenwen and Zhang, Jipeng and Wang, Lei and Qin, Wei and Lan, Yunshi and Shao, Jie and Zhang, Xiangliang},
  booktitle={Findings of NAACL 2022},
  pages={997--1009},
  year={2022}
}
```