Abstract

Automatically solving math word problems (MWPs) is a number-intensive application in natural language processing (NLP). However, existing methods are far from achieving acceptable levels of numeracy learning, which limits their performance on mathematical reasoning. In addition, the mainstream tree decoder suffers from early-stage information loss, resulting in unsatisfactory performance on complex problems with more operators. In this paper, we propose NERHRT (Number-Enhanced Representation with Hierarchical Recursive Tree Decoding), a simple yet effective number embedding method that enhances numerical reasoning via a decimal notation-based embedding and a dual-direction graph attention network. In addition, a hierarchical recursive tree-structured decoder is introduced to aggregate information from all ancestor nodes. Experiments show that our approach obtains the best performance on four popular benchmark datasets and outperforms state-of-the-art models by a large margin.
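The decimal notation-based embedding mentioned above can be illustrated with a minimal sketch. The code below is an illustrative assumption, not the paper's actual formulation: each digit of a number is looked up in a small embedding table and scaled by its decimal place, and the results are pooled into a single number vector, so that numbers with the same digits in different places (e.g. "12" vs. "21") receive different embeddings.

```python
# Minimal sketch of a decimal notation-based number embedding.
# Illustrative only; NERHRT's actual formulation may differ.
import random

DIM = 8
random.seed(0)
# Hypothetical lookup table: one vector per digit 0-9 (would be learned).
digit_emb = {d: [random.uniform(-1, 1) for _ in range(DIM)] for d in range(10)}

def embed_number(num_str):
    """Embed a decimal number string such as '3.14'."""
    if "." in num_str:
        int_part, frac_part = num_str.split(".")
    else:
        int_part, frac_part = num_str, ""
    vec = [0.0] * DIM
    count = 0
    # Integer digits: scale by (decimal exponent + 1), so higher places weigh more.
    for i, ch in enumerate(int_part):
        exponent = len(int_part) - 1 - i       # place value is 10**exponent
        scale = exponent + 1
        for k in range(DIM):
            vec[k] += digit_emb[int(ch)][k] * scale
        count += 1
    # Fractional digits: negative scales mark places after the decimal point.
    for i, ch in enumerate(frac_part):
        scale = -(i + 1)
        for k in range(DIM):
            vec[k] += digit_emb[int(ch)][k] * scale
        count += 1
    # Mean-pool the scaled digit vectors into one number embedding.
    return [v / count for v in vec]
```

In this sketch the place-dependent scaling is what makes the representation notation-aware: the same digit contributes differently depending on where it sits relative to the decimal point.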
