Abstract

Posterior collapse, also known as Kullback-Leibler (KL) vanishing, is a long-standing problem in the variational recurrent autoencoder (VRAE), which is essentially developed for sequence generation. To alleviate this vanishing problem, a more expressive latent variable is required instead of assuming a standard Gaussian. Normalizing flows were proposed to build bijective neural networks that convert a simple distribution into a complex one, so that the resulting approximate posterior is closer to the true posterior for better sequence generation. The KL divergence in the learning objective is accordingly preserved, enriching the capability of generating diverse sequences. This paper presents the flow-based VRAE, which builds a disentangled latent representation for sequence generation. KL-preserving flows are exploited for the conditional VRAE and evaluated on text representation as well as dialogue generation. In the implementation, amortized regularization and skip connections are further imposed to strengthen the embedding and prediction. Experiments on different tasks show the merit of this latent variable representation for language modeling, sentiment classification, and dialogue generation.
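To make the flow mechanism the abstract refers to concrete, the sketch below shows a planar normalizing flow (Rezende & Mohamed, 2015) applied to a reparameterized Gaussian posterior sample, with the log-determinant bookkeeping that keeps the transformed posterior a tractable density and the KL term evaluable. This is a minimal illustration in PyTorch under assumed settings; the PlanarFlow class, latent size, flow depth, and dummy encoder outputs are hypothetical and do not reproduce the paper's actual KL-preserving flow or conditional VRAE.

```python
import torch
import torch.nn as nn

class PlanarFlow(nn.Module):
    """One planar flow step f(z) = z + u * tanh(w^T z + b).
    Tracks log|det Jacobian| so the flowed posterior stays a valid density.
    Invertibility technically requires constraining u (omitted for brevity)."""
    def __init__(self, dim):
        super().__init__()
        self.u = nn.Parameter(torch.randn(dim) * 0.01)
        self.w = nn.Parameter(torch.randn(dim) * 0.01)
        self.b = nn.Parameter(torch.zeros(1))

    def forward(self, z):
        # z: (batch, dim)
        linear = z @ self.w + self.b                          # (batch,)
        f_z = z + self.u * torch.tanh(linear).unsqueeze(-1)   # bijective map
        # log|det df/dz| = log|1 + u^T psi(z)|, psi(z) = (1 - tanh^2) * w
        psi = (1.0 - torch.tanh(linear) ** 2).unsqueeze(-1) * self.w
        log_det = torch.log(torch.abs(1.0 + psi @ self.u) + 1e-8)
        return f_z, log_det

# Push a Gaussian posterior sample through K flow steps and estimate the
# KL term of the ELBO by Monte Carlo (additive constants cancel in the KL).
mu, logvar = torch.zeros(4, 16), torch.zeros(4, 16)   # dummy encoder outputs
z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # z_0 ~ q_0(z | x)
log_q = (-0.5 * (logvar + (z - mu) ** 2 / logvar.exp())).sum(-1)
flows = nn.ModuleList([PlanarFlow(16) for _ in range(4)])
for flow in flows:
    z, log_det = flow(z)
    log_q = log_q - log_det          # change of variables after each bijection
log_p = (-0.5 * z ** 2).sum(-1)      # standard-normal prior, constants dropped
kl = (log_q - log_p).mean()          # KL[q_K(z|x) || p(z)] estimate
```

Because each bijection's log-determinant is accumulated, the flowed posterior remains a density that can be evaluated exactly, which is what lets the KL term be preserved rather than collapsing to zero.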
