Abstract

Recent advances in generative networks have shown that it is possible to produce real-world-like data using deep neural networks. Implicit probabilistic models, which follow a stochastic procedure to directly generate data, have been introduced to overcome the intractability of the posterior distribution. However, modeling data requires deep knowledge and understanding of its statistical dependence, which can be preserved and studied in appropriate latent spaces. In this article, we present a segmented generation process based on linear and nonlinear manipulations in a latent space of the same dimensionality onto which the data are projected. Inspired by the classical stochastic method for generating correlated data, we develop a segmented approach for the generation of dependent data that exploits the concept of copula. The generation process is split into two frames: one embedding the covariance or copula information in the uniform probability space, and the other embedding the marginal distribution information in the sample domain. The proposed network structure, referred to as a segmented generative network (SGN), also provides an empirical method to sample directly from implicit copulas. To show its generality, we evaluate the presented approach in three application scenarios: a toy example, handwritten digit generation, and face image generation.
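The classical two-frame idea the abstract refers to can be sketched, outside of any neural network, with a Gaussian copula: correlated Gaussian draws are pushed through the standard normal CDF to obtain uniforms that carry only the dependence structure, and the desired marginals are then imposed by inverse-CDF sampling. The correlation matrix and marginal distributions below are illustrative assumptions, not quantities from the paper, and the snippet is not the SGN implementation.

```python
# Minimal sketch of the two-frame generation idea using a Gaussian copula.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Frame 1: dependence. Draw correlated Gaussians and push them through the
# standard normal CDF; the result lives in the uniform probability space and
# carries only the copula (dependence) information.
corr = np.array([[1.0, 0.7],
                 [0.7, 1.0]])                      # assumed target correlation
z = rng.multivariate_normal(mean=np.zeros(2), cov=corr, size=10_000)
u = stats.norm.cdf(z)                               # uniforms with the desired dependence

# Frame 2: marginals. Apply the inverse CDFs of the desired marginal
# distributions to move from the uniform space to the sample domain.
x0 = stats.expon(scale=2.0).ppf(u[:, 0])            # exponential marginal (assumed)
x1 = stats.norm(loc=5.0, scale=1.5).ppf(u[:, 1])    # Gaussian marginal (assumed)
samples = np.column_stack([x0, x1])

print(np.corrcoef(samples, rowvar=False))           # dependence is approximately preserved
```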

Highlights

  • Deep learning literature has grown significantly in recent years

  • We refer to this methodology as a segmented generative network (SGN): SGN-C when the process targets the generation of data with the same correlation as the samples in the data set, and SGN-D when the process targets the full statistical dependence among the data

  • We qualitatively evaluate the generation performance of the SGN-C and SGN-D approaches


Summary

INTRODUCTION

Deep learning literature has grown significantly in recent years. The key to its success lies in the broad availability of labeled data, increased computational power, and GPUs. We want to show how both linear and nonlinear dependence in collected data can be exploited in a different latent space, namely the uniform probability space (UPS). This enables a deeper understanding of the underlying true distribution. We propose a combined generative-like model that splits the data generation process into two frames: one embedding the covariance or copula information in the uniform space, and the other embedding the marginal distribution information in the sample domain. We refer to this methodology as a segmented generative network (SGN): SGN-C when the process targets the generation of data with the same correlation as the samples in the data set, and SGN-D when the process targets the full statistical dependence among the data.
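To make the UPS concrete, the following sketch projects a toy data set into the uniform probability space with the empirical probability integral transform (rank transform). The toy data, function name, and constants are assumptions for illustration only and do not reproduce the SGN architecture.

```python
# Hedged illustration: mapping data into the uniform probability space (UPS)
# via the empirical probability integral transform.
import numpy as np

def to_uniform_space(data: np.ndarray) -> np.ndarray:
    """Map each column of `data` to (0, 1) using its empirical CDF (rank transform)."""
    n = data.shape[0]
    ranks = data.argsort(axis=0).argsort(axis=0) + 1   # ranks 1 .. n per column
    return ranks / (n + 1)                              # avoid the endpoints 0 and 1

# Toy data with nonlinear dependence between the two coordinates (assumed example).
rng = np.random.default_rng(0)
x = rng.normal(size=5_000)
data = np.column_stack([x, np.tanh(x) + 0.1 * rng.normal(size=x.size)])

u = to_uniform_space(data)
# In the UPS the marginals are approximately uniform, while the dependence
# structure (the empirical copula) is retained and can be inspected, e.g.
# through the rank correlation of the transformed coordinates.
print(np.corrcoef(u, rowvar=False))
```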

Definitions and Notation
RELATED WORK
PROPOSED APPROACH
Transform Sampling
EVALUATION OF RESULTS
Qualitative Evaluation
Quantitative Evaluation
CONCLUSION