Abstract

Table-based fact verification (TFV) is a binary classification task that requires understanding and reasoning over both tables and text. The task poses many challenges, including table parsing, text comprehension, and numerical reasoning. However, existing methods tend to rely solely on models pre-trained on tables, treating all types of reasoning equally and overlooking the importance of identifying logic types during inference. To this end, we propose MoETFV, an efficient and explainable approach to TFV based on a Mixture-of-Experts (MoE) framework. The approach detects the underlying logic type of a statement and leverages multiple independent experts to emulate diverse logical reasoning: one shared expert handles general semantic understanding, while several specific experts take distinct responsibility for different logical inferences. We also thoroughly investigate the practical application of the MoE method to TFV. The model requires no table pre-trained models and aligns closely with human cognitive processes when addressing such problems. Experimental results demonstrate the novelty and feasibility of the proposed approach.
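The abstract's shared-plus-specific expert design can be illustrated with a minimal sketch. This is not the paper's actual architecture: the hidden size, the number of logic types, the tanh experts, and the softmax gate over logic types are all illustrative assumptions; only the overall shape (a shared expert applied to every statement, plus a gated mixture of logic-type-specific experts) follows the description above.

```python
import numpy as np

rng = np.random.default_rng(0)

DIM = 16        # hidden size (illustrative)
N_EXPERTS = 3   # one specific expert per assumed logic type, e.g. count / compare / superlative

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Shared expert: general semantic understanding, applied to every statement.
W_shared = rng.normal(size=(DIM, DIM))
# Specific experts: one per logic type.
W_experts = rng.normal(size=(N_EXPERTS, DIM, DIM))
# Gate: predicts a distribution over logic types from the statement encoding.
W_gate = rng.normal(size=(DIM, N_EXPERTS))

def moe_layer(h):
    """Combine the shared expert with a gated mixture of specific experts."""
    gate = softmax(h @ W_gate)          # soft logic-type weights, sum to 1
    shared = np.tanh(W_shared @ h)
    specific = sum(gate[i] * np.tanh(W_experts[i] @ h) for i in range(N_EXPERTS))
    return shared + specific

h = rng.normal(size=DIM)                # placeholder statement encoding
out = moe_layer(h)
print(out.shape)  # (16,)
```

At inference time, the gate's argmax also serves as an interpretable prediction of which logic type the statement requires, which is what makes this style of model explanatory rather than a single monolithic classifier.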
