Abstract

Flow-based generative models are a family of exact log-likelihood models with tractable sampling and latent-variable inference, which makes them conceptually attractive for modeling complex distributions. However, flow-based models generally fall short of state-of-the-art autoregressive models in density estimation performance, while autoregressive models, which also belong to the family of likelihood-based methods, suffer from limited parallelizability. In this paper, we propose Dynamic Linear Flow (DLF), a new family of invertible transformations with a partially autoregressive structure. Our method combines the efficient computation of flow-based methods with the high density estimation performance of autoregressive methods. We demonstrate that the proposed DLF achieves state-of-the-art performance among flow-based methods on ImageNet $32\times 32$ and $64\times 64$. Additionally, DLF converges significantly faster than previous flow-based methods such as Glow.
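
As a concrete illustration of the kind of invertible transformation described above, the following sketch implements an affine coupling-style map: half of the input passes through unchanged and parameterizes a per-dimension scale and shift for the other half, so the inverse and the log-determinant are available in closed form. This is a minimal numpy sketch, not the paper's implementation; the conditioner is a fixed random linear map standing in for a learned network, and the names (conditioner, forward, inverse, W) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 8                                        # toy dimensionality
W = rng.normal(scale=0.1, size=(D // 2, D))  # random stand-in for a learned conditioner

def conditioner(x_a):
    """Predict a per-dimension log-scale and shift for x_b from x_a."""
    h = x_a @ W
    log_s, t = h[:, : D // 2], h[:, D // 2 :]
    return np.tanh(log_s), t                 # bound the log-scale for numerical stability

def forward(x):
    """y = concat(x_a, x_b * exp(log_s) + t); also return log|det J| = sum(log_s)."""
    x_a, x_b = x[:, : D // 2], x[:, D // 2 :]
    log_s, t = conditioner(x_a)
    y_b = x_b * np.exp(log_s) + t
    return np.concatenate([x_a, y_b], axis=1), log_s.sum(axis=1)

def inverse(y):
    """Exact inverse: x_b = (y_b - t) * exp(-log_s); x_a is unchanged."""
    y_a, y_b = y[:, : D // 2], y[:, D // 2 :]
    log_s, t = conditioner(y_a)              # y_a equals x_a, so parameters are recoverable
    return np.concatenate([y_a, (y_b - t) * np.exp(-log_s)], axis=1)

x = rng.normal(size=(4, D))
y, logdet = forward(x)
assert np.allclose(inverse(y), x)            # the map is exactly invertible
```

Because the log-determinant is simply the sum of the log-scales, exact log-likelihoods remain cheap to evaluate when many such maps are stacked.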

Highlights

  • The increasing amount of data, paired with the exponential progress in hardware capabilities and relentless efforts toward better methods, has greatly advanced deep learning in areas such as image classification [1]–[3] and machine translation [4]–[6]

  • Similar to the autoregressive (AR) and inverse autoregressive (IAR) transformations, we introduce a variant, the dynamic linear transformation

  • To increase the capability of the model, we describe Dynamic Linear Flow (DLF), a flow-based model that uses the dynamic linear transformation as a building block (see the sketch following this list)
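
A flow such as DLF is obtained by composing several invertible blocks; by the change-of-variables formula, the exact log-likelihood is the base log-density of the final latent plus the accumulated log-determinants. The self-contained numpy sketch below shows this bookkeeping with a coupling-style block and random permutations between blocks; the block count, the permutations, and the Gaussian base distribution are illustrative choices rather than details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
D, K = 6, 4                                     # toy data dimension, number of flow blocks
Ws = [rng.normal(scale=0.1, size=(D // 2, D)) for _ in range(K)]
perms = [rng.permutation(D) for _ in range(K)]  # shuffle dimensions between blocks

def block_forward(x, W):
    """One coupling-style block: identity on the first half, affine on the second."""
    x_a, x_b = x[:, : D // 2], x[:, D // 2 :]
    h = x_a @ W
    log_s, t = np.tanh(h[:, : D // 2]), h[:, D // 2 :]
    y = np.concatenate([x_a, x_b * np.exp(log_s) + t], axis=1)
    return y, log_s.sum(axis=1)                 # log|det J| is the sum of log-scales

def flow_log_likelihood(x):
    """Exact log p(x) = log N(z; 0, I) + sum of per-block log-determinants."""
    z, total_logdet = x, np.zeros(x.shape[0])
    for W, perm in zip(Ws, perms):
        z = z[:, perm]                          # a permutation has |det| = 1
        z, logdet = block_forward(z, W)
        total_logdet += logdet
    log_base = -0.5 * (z ** 2 + np.log(2 * np.pi)).sum(axis=1)
    return log_base + total_logdet

x = rng.normal(size=(5, D))
print(flow_log_likelihood(x))                   # one exact log-density per sample
```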

Summary

INTRODUCTION

The increasing amount of data, paired with the exponential progress in hardware capabilities and relentless efforts toward better methods, has greatly advanced deep learning in areas such as image classification [1]–[3] and machine translation [4]–[6]. Likelihood-based generative methods can be divided into three categories: variational autoencoders [10], autoregressive models [11]–[14], and flow-based generative methods [15]–[17]. Autoregressive models and flow-based generative models both estimate the exact likelihood of the data. Flow-based generative models are efficient for training and synthesis, but generally yield weaker performance than autoregressive models on density estimation benchmarks. We show that autoregressive models and flow-based generative models are two extreme forms of our proposed method. Synthesis with our model takes less than one second, which is comparable to most flow-based methods.
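
One way to read the claim that autoregressive and flow-based models are two extremes of the proposed method is sketched below: partition the input into K chunks and transform each chunk with a scale and shift conditioned on all preceding chunks. With K = 2 the inverse is a single parallel step, as in coupling-based flows, whereas with K equal to the dimensionality the inverse must be computed chunk by chunk, as in autoregressive models. The partitioning scheme and parameterization here are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(2)
D = 8                                                     # toy dimensionality

def build_transform(K):
    """Return forward/inverse for a K-chunk, partially autoregressive affine map."""
    bounds = np.linspace(0, D, K + 1).astype(int)         # chunk boundaries
    Ws = {k: rng.normal(scale=0.1,
                        size=(bounds[k], 2 * (bounds[k + 1] - bounds[k])))
          for k in range(1, K)}                           # stand-ins for learned conditioners

    def params(k, ctx):
        h = ctx @ Ws[k]
        m = h.shape[1] // 2
        return np.tanh(h[:, :m]), h[:, m:]                # bounded log-scale, shift

    def forward(x):
        y = x.copy()                                      # chunk 0 passes through unchanged
        for k in range(1, K):
            lo, hi = bounds[k], bounds[k + 1]
            log_s, t = params(k, x[:, :lo])               # condition on preceding chunks
            y[:, lo:hi] = x[:, lo:hi] * np.exp(log_s) + t
        return y

    def inverse(y):
        x = y.copy()
        for k in range(1, K):                             # chunks must be recovered in order
            lo, hi = bounds[k], bounds[k + 1]
            log_s, t = params(k, x[:, :lo])               # uses already-recovered chunks
            x[:, lo:hi] = (y[:, lo:hi] - t) * np.exp(-log_s)
        return x

    return forward, inverse

x = rng.normal(size=(3, D))
for K in (2, D):          # K=2: coupling-like, one-shot inverse; K=D: fully sequential inverse
    fwd, inv = build_transform(K)
    assert np.allclose(inv(fwd(x)), x)                    # invertibility holds at both extremes
```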

BACKGROUND
AUTOREGRESSIVE AND INVERSE AUTOREGRESSIVE TRANSFORMATIONS
CONDITIONAL DYNAMIC LINEAR TRANSFORMATION
DYNAMIC LINEAR FLOW
RELATED WORK
EXPERIMENTS
Findings
CONCLUSION