Abstract

Logical table-to-text generation is a natural language generation (NLG) task that aims to produce coherent and logically faithful sentences from tables. Unlike conventional NLG tasks, it demands not only surface-level fluency but also a high degree of logic-level fidelity in the generated outputs. Current table-to-text systems grapple with quality issues such as repetitive generation, insufficient reasoning, and limited complexity. We therefore introduce LogicMoE, a dedicated Mixture-of-Experts (MoE) model tailored for logical table-to-text generation. The primary objective of LogicMoE is to enrich the diversity of generated sentences from both semantic and logical perspectives: each expert within the model serves as a specialized generator responsible for producing sentences of a specific logical type. Additionally, we propose novel evaluation metrics to comprehensively assess the diversity of generated outputs. Our experimental results demonstrate LogicMoE's superiority, with absolute improvements of 0.8 and 2.2 BLEU-3 over strong baselines on the LogicNLG and Logic2Text datasets, respectively, setting a new state of the art. Furthermore, we highlight its inherent advantages in diversity and controllability, signifying its potential to spearhead advancements in logical table-to-text generation applications.
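The expert-per-logical-type design described above can be sketched minimally as follows. This is an illustrative toy, not LogicMoE's actual implementation: the set of logic types and the string-producing "generators" are assumptions standing in for the model's real expert networks, and the hard routing by requested logic type is one plausible reading of how type-controlled generation works.

```python
# Illustrative sketch: one dedicated expert (generator) per logical type.
# Logic types and toy generators are assumptions, not the paper's actual model.

LOGIC_TYPES = ["count", "superlative", "comparative"]  # assumed label set


def make_expert(logic_type):
    """Build a toy expert specialized for one logical type."""
    def expert(table_name):
        # A real expert would condition a neural decoder on the table;
        # here we just tag the output with the expert's logic type.
        return f"[{logic_type}] sentence about {table_name}"
    return expert


class LogicMoESketch:
    def __init__(self, logic_types):
        # One specialized generator per logical type.
        self.experts = {t: make_expert(t) for t in logic_types}

    def generate(self, table_name, logic_type):
        # Hard routing: the requested logic type selects its expert,
        # which is also what makes the output's logic type controllable.
        return self.experts[logic_type](table_name)


moe = LogicMoESketch(LOGIC_TYPES)
print(moe.generate("medal_table", "superlative"))
```

Routing by an explicit type label (rather than a learned soft gate) is what gives the controllability highlighted in the abstract: the caller chooses which kind of logical sentence to produce.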
