Deep Generative Models: Complexity, Dimensionality, and Approximation

1 April 2025
Kevin Wang
Hongqian Niu
Yixin Wang
Didong Li
    DRL
Abstract

Generative networks have shown remarkable success in learning complex data distributions, particularly in generating high-dimensional data from lower-dimensional inputs. While this capability is well documented empirically, its theoretical underpinning remains unclear. One common theoretical explanation appeals to the widely accepted manifold hypothesis, which holds that many real-world datasets, such as images and signals, possess intrinsic low-dimensional geometric structure. Under this hypothesis, it is widely believed that to approximate a distribution on a d-dimensional Riemannian manifold, the latent dimension must be at least d or d+1. In this work, we show that this requirement on the latent dimension is not necessary: drawing inspiration from space-filling curves, we demonstrate that generative networks can approximate distributions on d-dimensional Riemannian manifolds from inputs of arbitrary dimension, even lower than d. This approach, in turn, yields a super-exponential complexity bound for the deep neural networks involved, through expanded neurons. Our findings thus challenge the conventional belief about the relationship between input dimensionality and the ability of generative networks to model data distributions. This insight not only corroborates the practical effectiveness of generative networks in handling complex data structures, but also highlights a critical trade-off between approximation error, dimensionality, and model complexity.
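
The space-filling-curve intuition behind the abstract can be made concrete with a small numerical illustration. The sketch below is not the paper's construction; it is a hypothetical example using the standard Hilbert-curve index-to-coordinate mapping (hilbert_d2xy is an illustrative helper, and NumPy is assumed). It pushes a 1-dimensional uniform latent forward onto the unit square: the resulting 2-D samples are approximately uniform on [0,1]^2, even though the input dimension (1) is lower than the target dimension (2).

import numpy as np

def hilbert_d2xy(order, t):
    # Illustrative helper (not from the paper): map an integer index
    # t in [0, 4**order) to (x, y) on the 2**order x 2**order Hilbert
    # curve, using the standard iterative bit-manipulation construction.
    x = y = 0
    s = 1
    while s < 2 ** order:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:            # rotate the quadrant when needed
            if rx == 1:
                x = s - 1 - x
                y = s - 1 - y
            x, y = y, x        # swap coordinates
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

order = 6                          # 2**6 = 64 grid cells per side
n_cells = 4 ** order               # number of points along the curve
z = np.random.rand(10_000)         # 1-D latent samples, uniform on [0, 1)
idx = (z * n_cells).astype(int)    # quantize each latent onto the curve
xy = np.array([hilbert_d2xy(order, t) for t in idx]) / 2 ** order
# xy has shape (10000, 2); its empirical distribution is close to
# uniform on the unit square despite the 1-D input.

Raising order tightens the approximation but makes the mapping itself finer and more intricate, which loosely mirrors the trade-off the abstract describes between approximation error, dimensionality, and model complexity.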

@article{wang2025_2504.00820,
  title={Deep Generative Models: Complexity, Dimensionality, and Approximation},
  author={Kevin Wang and Hongqian Niu and Yixin Wang and Didong Li},
  journal={arXiv preprint arXiv:2504.00820},
  year={2025}
}