Abstract

In the modern world, structured and semi-structured knowledge bases hold a considerable amount of data. Therefore, the ability to query them efficiently and clearly should not be limited to people who are familiar with formal query languages. Semantic Parsing (SP) is the task of converting natural language utterances into formal meaning representations. This paper proposes a model for SP that uses a novel method of employing a Semi-Supervised Generative Adversarial Network (SS-GAN) to enhance classifier performance. The proposed SS-GAN extends the fine-tuning of word-embedding architectures with unlabeled examples in a generative adversarial setting. We provide a regularization strategy that addresses the mode-missing problem and unstable training in the SS-GAN. The main idea is to use the feature vectors extracted by the discriminator, so that the generator produces outputs with the aid of the discriminator's learned features. A reconstruction loss is added to the loss function of the SS-GAN to drive the generator to reconstruct outputs from the discriminator's features, thereby steering the generator toward real data configurations. The proposed reconstruction loss improves the performance of the SS-GAN, produces high-quality outputs, and can be combined with other regularization loss functions to improve the performance of diverse GANs. We employ BERT word embeddings in our model, which can be included in a downstream task and fine-tuned as part of the model, while the pre-trained BERT model captures various linguistic properties. We evaluate the proposed model on the WikiSQL and SParC datasets, and the results show that our model outperforms its rivals. Our experiments further indicate that the need for labeled samples can be reduced to as few as 100 instances while still achieving commendable classification results.
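To make the reconstruction-loss idea described above concrete, the sketch below shows one plausible way a feature-reconstruction term could be added to a generator's adversarial loss in an SS-GAN. This is a minimal illustration, not the authors' implementation: the module names (`Generator`, `Discriminator`, `generator_loss`), the dimensions (768 as a stand-in for BERT embeddings, 64-dimensional noise), and the loss weights are all hypothetical.

```python
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Maps noise vectors to synthetic feature vectors (illustrative sizes)."""
    def __init__(self, noise_dim=64, out_dim=768):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(noise_dim, 256), nn.ReLU(),
            nn.Linear(256, out_dim),
        )

    def forward(self, z):
        return self.net(z)

class Discriminator(nn.Module):
    """Classifier with an exposed intermediate feature layer.
    Follows the usual SS-GAN convention of K real classes plus one 'fake' class."""
    def __init__(self, in_dim=768, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU())
        self.head = nn.Linear(256, num_classes + 1)

    def forward(self, x):
        f = self.features(x)
        return self.head(f), f

def generator_loss(disc, gen, z, real_x, adv_weight=1.0, rec_weight=1.0):
    """Adversarial term plus a reconstruction term that pushes generated samples
    toward the discriminator's features of real data. Weights are illustrative."""
    fake_x = gen(z)
    fake_logits, fake_feat = disc(fake_x)
    _, real_feat = disc(real_x)

    # Reconstruction term: match discriminator features of real data.
    rec_loss = nn.functional.mse_loss(fake_feat, real_feat.detach())

    # Non-saturating adversarial term: generated samples should not be
    # assigned to the extra 'fake' class (last logit).
    fake_class = fake_logits.size(1) - 1
    p_fake = torch.softmax(fake_logits, dim=1)[:, fake_class]
    adv_loss = -torch.log(1.0 - p_fake + 1e-8).mean()

    return adv_weight * adv_loss + rec_weight * rec_loss

# Example usage with random stand-in data (batch of 8):
gen, disc = Generator(), Discriminator()
z = torch.randn(8, 64)
real_x = torch.randn(8, 768)  # placeholder for BERT sentence embeddings
loss = generator_loss(disc, gen, z, real_x)
loss.backward()
```

The key design point, under these assumptions, is that the generator is optimized against the discriminator's learned feature space rather than only its final real/fake decision, which is one common way to mitigate mode missing and stabilize GAN training.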
