Abstract

Automatically solving math word problems (MWPs) is a challenging task in artificial intelligence (AI) and machine learning (ML) research, in which the goal is to answer a problem stated in natural language with a mathematical expression. Many existing approaches simply model an MWP as a flat sequence of words, which falls short of precise solving. To address this, we turn to how humans solve MWPs. Humans read a problem part by part, capture the dependencies between words to understand it thoroughly, and then infer the expression in a goal-driven manner by applying knowledge. Moreover, humans associate different MWPs and draw on related experience to solve the problem at hand. In this article, we present a focused study on an MWP solver that imitates this procedure. Specifically, we first propose a novel hierarchical math solver (HMS) to exploit the semantics within a single MWP. To imitate human reading habits, we design an encoder that learns semantics guided by the dependencies between words, following a hierarchical "word-clause-problem" paradigm. We then develop a goal-driven tree-based decoder with knowledge application to generate the expression. Going one step further, to imitate how humans associate different MWPs and draw on related experience in problem solving, we extend HMS to the Relation-enHanced Math Solver (RHMS), which exploits the relations between MWPs. To capture structural similarity, we develop a meta-structure tool that measures similarity based on the logical structure of MWPs and construct a graph associating related MWPs. Based on this graph, we then learn an improved solver that exploits related experience for higher accuracy and robustness. Finally, we conduct extensive experiments on two large datasets, which demonstrate the effectiveness of the two proposed methods and the superiority of RHMS.
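
To make the "word-clause-problem" paradigm more concrete, the minimal Python sketch below shows one way such a hierarchical encoder could be organized: a word-level recurrent layer summarizes each clause, and a clause-level layer summarizes the whole problem. This is an illustration only, not the authors' implementation; the class name HierarchicalEncoder, the use of GRUs, and all dimensions are assumptions made for this example.

# Illustrative sketch (not the authors' code) of a hierarchical
# "word-clause-problem" encoder in the spirit of HMS, assuming
# pre-tokenized clauses and a simple GRU at each level.
import torch
import torch.nn as nn

class HierarchicalEncoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Word-level GRU: encodes each clause from its words.
        self.word_gru = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        # Clause-level GRU: encodes the whole problem from clause vectors.
        self.clause_gru = nn.GRU(hidden_dim, hidden_dim, batch_first=True)

    def forward(self, clauses):
        # clauses: list of LongTensors, each of shape (num_words,) for one clause.
        clause_vecs = []
        for clause in clauses:
            emb = self.embed(clause).unsqueeze(0)     # (1, num_words, emb_dim)
            _, h = self.word_gru(emb)                 # h: (1, 1, hidden_dim)
            clause_vecs.append(h.squeeze(0))          # (1, hidden_dim)
        clause_seq = torch.stack(clause_vecs, dim=1)  # (1, num_clauses, hidden_dim)
        clause_states, problem_vec = self.clause_gru(clause_seq)
        # clause_states can feed decoder attention; problem_vec can serve as the root goal.
        return clause_states, problem_vec.squeeze(0)

# Usage with toy token ids for a hypothetical two-clause problem.
enc = HierarchicalEncoder(vocab_size=100)
clauses = [torch.tensor([3, 7, 9]), torch.tensor([4, 2, 8, 5])]
clause_states, problem_vec = enc(clauses)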
