Abstract

<abstract><p>Solving arithmetic word problems (AWPs) that involve deep implicit relations is challenging. This paper proposes two approaches to the problem. The first uses a modifier-to-matrix (MTM) model to extract noun-modification components from the problem text; a missing entity recovery (MER) model then translates the explicit expressions into a node dependency graph (NDG), whose nodes recursively acquire relations from a knowledge base through the MER model until the goal is reduced to known quantities, after which the solving engine selects the appropriate knowledge as the prompt. The second approach combines explicit and implicit knowledge to strengthen reasoning ability. Experimental results on the dataset show that the proposed algorithms outperform the baseline algorithms in solving AWPs that require deep implicit relations.</p></abstract>
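The recursive expansion the abstract describes — NDG nodes pulling relations from a knowledge base until the goal bottoms out in known quantities — can be sketched roughly as follows. This is an illustrative assumption, not the paper's actual implementation: `Node`, `KNOWLEDGE_BASE`, `KNOWN`, and `expand` are all hypothetical names, and the toy knowledge base stands in for the MER model's recovered relations.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Node:
    """One node of a hypothetical node dependency graph (NDG)."""
    name: str
    value: Optional[float] = None       # known quantity once resolved
    deps: list = field(default_factory=list)

# Toy stand-in for the knowledge base: maps an unknown quantity to the
# relation (operator, operand names) that derives it from other quantities.
KNOWLEDGE_BASE = {
    "total_legs": ("add", ["chicken_legs", "rabbit_legs"]),
    "chicken_legs": ("mul", ["chickens", "legs_per_chicken"]),
    "rabbit_legs": ("mul", ["rabbits", "legs_per_rabbit"]),
}

# Quantities stated explicitly in the problem text (illustrative values).
KNOWN = {"chickens": 3.0, "rabbits": 2.0,
         "legs_per_chicken": 2.0, "legs_per_rabbit": 4.0}

def expand(name: str) -> Node:
    """Recursively acquire relations from the knowledge base until every
    leaf of the dependency graph is a known quantity, then evaluate."""
    if name in KNOWN:
        return Node(name, value=KNOWN[name])
    op, operands = KNOWLEDGE_BASE[name]   # relation-recovery step
    node = Node(name, deps=[expand(d) for d in operands])
    vals = [d.value for d in node.deps]
    node.value = sum(vals) if op == "add" else vals[0] * vals[1]
    return node

goal = expand("total_legs")   # 3*2 + 2*4 = 14.0
```

The key point mirrored here is the stopping condition: recursion ends only when a node's value is already known, which is how the goal is eventually "achieved with a known quantity."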
