arXiv:2107.12466
High-Dimensional Distribution Generation Through Deep Neural Networks

26 July 2021
Dmytro Perekrestenko
Léandre Eberhard
Helmut Bölcskei
Abstract

We show that every d-dimensional probability distribution of bounded support can be generated through deep ReLU networks out of a 1-dimensional uniform input distribution. What is more, this is possible without incurring a cost, in terms of approximation error measured in Wasserstein distance, relative to generating the d-dimensional target distribution from d independent random variables. This is enabled by a vast generalization of the space-filling approach discovered in (Bailey & Telgarsky, 2018). The construction we propose highlights the importance of network depth in driving the Wasserstein distance between the target distribution and its neural network approximation to zero. Finally, we find that, for histogram target distributions, the number of bits needed to encode the corresponding generative network equals the fundamental limit for encoding probability distributions as dictated by quantization theory.
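The depth mechanism the abstract alludes to can be illustrated with the classic sawtooth construction underlying the Telgarsky-style space-filling approach: a "tent" map built from two ReLUs, when composed with itself k times, produces a function with exponentially many (2^(k-1)) oscillations. This is only a minimal sketch of that general phenomenon, not the paper's actual construction; the function names and the number of composition steps below are illustrative choices.

```python
import numpy as np

def hat(x):
    # Tent map on [0, 1]: g(x) = 2x for x <= 1/2, 2(1 - x) for x >= 1/2.
    # Expressible with two ReLUs: relu(2x) - relu(4x - 2) on [0, 1].
    return 2 * np.minimum(x, 1 - x)

def sawtooth(x, k):
    # k-fold composition g∘g∘...∘g yields 2^(k-1) teeth: each extra
    # layer doubles the number of oscillations, which is why depth is
    # so effective at producing highly oscillatory (space-filling) maps.
    for _ in range(k):
        x = hat(x)
    return x

# Pushing 1-D uniform samples through the composed sawtooth keeps them
# spread over all of [0, 1], hinting at how a single uniform input can
# be redistributed by a deep network.
u = np.random.default_rng(0).uniform(size=100_000)
y = sawtooth(u, 5)
print(round(y.min(), 3), round(y.max(), 3))
```

Note that the depth-k composition costs only O(k) ReLU units, whereas representing the same 2^(k-1)-tooth function with a single hidden layer would require exponentially many units.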
