
Gaussian Auto-Encoder

Abstract

Generative AutoEncoders require a chosen probability distribution for the latent variables, usually a multivariate Gaussian. The original Variational AutoEncoder (VAE) only tests KL divergence for individual points, which does not directly ensure uniform coverage of the target density. Later methods improved on this by adding pairwise repulsion based on the Wasserstein metric (WAE, SWAE), which requires random sampling and approximations, or by using the average L_2 distance between 1D projections of Gaussian-smoothed samples (CWAE), finally obtaining an inexpensive, non-random analytic formula. However, the agreement of these methods with the Gaussian distribution rests on heuristic arguments, usually supported only by verifying two moments. A more accurate evaluation would test the agreement of the CDFs of radii and of pairwise distances, as in the Kolmogorov-Smirnov test. To directly optimize this natural criterion, this paper presents an approach that literally attracts the empirical distribution function toward the desired CDF, e.g. of a multivariate Gaussian distribution, to be used to enforce this distribution in the latent space of an AutoEncoder.
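To make the radius-CDF criterion concrete, here is a minimal sketch (not the paper's implementation; the function names and the squared-gap surrogate loss are assumptions for illustration). For z drawn from a standard d-dimensional Gaussian, the squared radius ||z||^2 follows a chi-square distribution with d degrees of freedom, so a Kolmogorov-Smirnov-style statistic can compare the empirical CDF of squared radii against this theoretical CDF, and a simple loss can attract the empirical distribution toward it:

```python
import numpy as np
from scipy.stats import chi2

def radius_ks_statistic(z):
    """KS statistic: max |ECDF - CDF| over squared radii of samples z (n x d)."""
    n, d = z.shape
    r2 = np.sort(np.sum(z ** 2, axis=1))       # sorted squared radii
    cdf = chi2.cdf(r2, df=d)                   # theoretical chi-square CDF values
    ecdf_hi = np.arange(1, n + 1) / n          # ECDF just after each sample point
    ecdf_lo = np.arange(0, n) / n              # ECDF just before each sample point
    return max(np.max(ecdf_hi - cdf), np.max(cdf - ecdf_lo))

def radius_cdf_loss(z):
    """Mean squared gap between theoretical CDF values and ECDF midpoints -
    a simple surrogate that attracts the empirical distribution of radii
    toward the Gaussian one (an assumed illustrative loss, not the paper's)."""
    n, d = z.shape
    r2 = np.sort(np.sum(z ** 2, axis=1))
    targets = (np.arange(1, n + 1) - 0.5) / n  # ECDF evaluated at midpoints
    return np.mean((chi2.cdf(r2, df=d) - targets) ** 2)

rng = np.random.default_rng(0)
z = rng.standard_normal((2000, 8))             # a genuinely Gaussian latent sample
print(radius_ks_statistic(z))                  # small for a Gaussian sample
print(radius_cdf_loss(z))
```

A non-Gaussian latent sample (e.g. uniform) yields a much larger statistic, which is what an AutoEncoder regularizer built on this criterion would penalize.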
