Abstract

This paper introduces a novel framework for generative models based on Restricted Kernel Machines (RKMs) with joint multi-view generation and uncorrelated feature learning, called Gen-RKM. To enable joint multi-view generation, this mechanism uses a shared representation of data from the various views. Furthermore, the model has a primal and dual formulation, allowing both kernel-based and (deep convolutional) neural-network-based models to be incorporated within the same setting. When using neural networks as explicit feature maps, a novel training procedure is proposed that jointly learns the features and the shared subspace representation. The latent variables are given by the eigendecomposition of the kernel matrix, where the mutual orthogonality of the eigenvectors represents the learned uncorrelated features. Experiments demonstrate the potential of the framework through qualitative and quantitative evaluation of generated samples on various standard datasets.
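To make the latent-variable construction above concrete, the following is a minimal NumPy sketch, not the authors' implementation: it assumes a Gaussian kernel (as in the paper's toy illustration) and kernel-PCA-style centering, and the names `gaussian_kernel`, `rkm_latent_codes`, `sigma` and `latent_dim` are illustrative choices.

```python
import numpy as np

def gaussian_kernel(X, sigma=1.0):
    """Gram matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    sq = np.sum(X**2, 1)[:, None] + np.sum(X**2, 1)[None, :] - 2 * X @ X.T
    return np.exp(-sq / (2 * sigma**2))

def rkm_latent_codes(X, latent_dim=2, sigma=1.0):
    """Latent variables from the eigendecomposition of a centered kernel matrix.

    Each row of the returned H is the latent code of one sample; the columns
    are eigenvectors of a symmetric matrix and hence mutually orthogonal.
    """
    n = X.shape[0]
    K = gaussian_kernel(X, sigma)
    C = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    eigvals, eigvecs = np.linalg.eigh(C @ K @ C)
    top = np.argsort(eigvals)[::-1][:latent_dim]  # largest eigenvalues first
    return eigvecs[:, top], eigvals[top]
```

With `H, lam = rkm_latent_codes(X)`, the product `H.T @ H` is the identity matrix (orthonormal eigenvectors), which is the mutual orthogonality the abstract refers to.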

Highlights

  • In the past decade, interest in generative models has grown tremendously, finding applications in multiple fields such as generated art, on-demand video, image denoising [1], exploration in reinforcement learning [2], collaborative filtering [3], in-painting [4] and many more.

  • Generative Adversarial Networks (GANs) are considered the state of the art for generative modeling, producing high-quality images, but they are difficult to train due to unstable training dynamics unless more sophisticated variants are applied.

  • We propose a novel generative mechanism based on the framework of Restricted Kernel Machines (RKMs) [23], called Generative-RKM (Gen-RKM).


Summary

Introduction

Interest in generative models has grown tremendously, finding applications in multiple fields such as generated art, on-demand video, image denoising [1], exploration in reinforcement learning [2], collaborative filtering [3], in-painting [4] and many more. In the multi-view setting, each view could individually be used for learning tasks, but exploiting information from all views together can improve the learning quality [10, 11, 12]. One of the goals of latent variable modelling is to describe data in terms of uncorrelated or independent components. Although the definition of disentanglement in the literature is not precise, many believe that a representation with statistically independent variables is a good starting point [16, 17]. Such representations extract information into a compact form, which makes it possible to generate samples with specific characteristics [18, 19, 20, 21]. Thanks to the orthogonality of the eigenvectors of the kernel matrix, the learned latent variables are uncorrelated. This resembles a disentangled representation, which makes it possible to generate data with specific characteristics.
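The uncorrelatedness claim can be checked directly. The following self-contained toy sketch (an illustration under the same Gaussian-kernel and centering assumptions as above, not the paper's experiment; the data, bandwidth and latent dimension are arbitrary choices) verifies that the sample covariance of the eigenvector-based latent codes is diagonal:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                     # toy data: 200 samples

# Centered Gaussian kernel matrix (sigma = 1), as in the sketch above.
sq = np.sum(X**2, 1)[:, None] + np.sum(X**2, 1)[None, :] - 2 * X @ X.T
K = np.exp(-sq / 2.0)
C = np.eye(200) - np.ones((200, 200)) / 200
eigvals, eigvecs = np.linalg.eigh(C @ K @ C)

H = eigvecs[:, np.argsort(eigvals)[::-1][:3]]     # top-3 latent codes

# Eigenvectors of a symmetric matrix are orthogonal, and centering makes the
# nonzero-eigenvalue components zero-mean, so the code covariance is diagonal.
print(np.round(np.cov(H, rowvar=False), 6))       # off-diagonal entries ~ 0
```

The vanishing off-diagonal entries are the uncorrelated features the introduction describes; statistical independence, as discussed in the disentanglement literature [16, 17], is a strictly stronger property.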

Related Work
Training phase of the RKM
Generation
Implicit feature map
Explicit feature map
The Gen-RKM Algorithm
Experiments
Conclusion and future work
A Derivation of Gen-RKM objective function
Computing latent variables using covariance matrix
B Stabilizing the objective function
D Architecture details
F Visualizing the disentanglement metric
Illustration on toy example using a Gaussian kernel