MWP-BERT

NAACL 2022 Findings Paper: MWP-BERT: Numeracy-Augmented Pre-training for Math Word Problem Solving

Github link: https://github.com/LZhenwen/MWP-BERT/

Please use the tokenizer from "hfl/chinese-bert-wwm-ext" with this model.
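
For reference, a minimal loading sketch with Hugging Face transformers is shown below. It assumes the checkpoint follows the standard BERT architecture; `"path/to/MWP-BERT"` is a placeholder for this repository's model id, so replace it with the actual path.

```python
# Minimal sketch: load MWP-BERT with the tokenizer noted above.
# "path/to/MWP-BERT" is a placeholder for this repository's model id.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("hfl/chinese-bert-wwm-ext")
model = BertModel.from_pretrained("path/to/MWP-BERT")  # placeholder id
model.eval()

# Encode a (Chinese) math word problem and extract contextual embeddings,
# e.g. for a downstream math-word-problem solver.
problem = "小明有3个苹果，又买了5个，他一共有几个苹果？"
inputs = tokenizer(problem, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # [1, seq_len, hidden_size]
```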

Citation

@inproceedings{liang2022mwp,
  title={MWP-BERT: Numeracy-Augmented Pre-training for Math Word Problem Solving},
  author={Liang, Zhenwen and Zhang, Jipeng and Wang, Lei and Qin, Wei and Lan, Yunshi and Shao, Jie and Zhang, Xiangliang},
  booktitle={Findings of NAACL 2022},
  pages={997--1009},
  year={2022}
}