ResearchTrend.AI
arXiv:2503.19429
Quantifying the Ease of Reproducing Training Data in Unconditional Diffusion Models
Masaya Hasegawa, Koji Yasuda
25 March 2025

Papers citing "Quantifying the Ease of Reproducing Training Data in Unconditional Diffusion Models"

6 / 6 papers shown

  1. Hierarchical Text-Conditional Image Generation with CLIP Latents
     Aditya A. Ramesh, Prafulla Dhariwal, Alex Nichol, Casey Chu, Mark Chen
     Tags: VLM, DiffM. 13 Apr 2022. 370 / 6,854 / 0
  2. Score-Based Generative Modeling through Stochastic Differential Equations
     Yang Song, Jascha Narain Sohl-Dickstein, Diederik P. Kingma, Abhishek Kumar, Stefano Ermon, Ben Poole
     Tags: DiffM, SyDa. 26 Nov 2020. 325 / 6,444 / 0
  3. Denoising Diffusion Implicit Models
     Jiaming Song, Chenlin Meng, Stefano Ermon
     Tags: VLM, DiffM. 06 Oct 2020. 247 / 7,350 / 0
  4. Generative Modeling by Estimating Gradients of the Data Distribution
     Yang Song, Stefano Ermon
     Tags: SyDa, DiffM. 12 Jul 2019. 228 / 3,893 / 0
  5. Deep Unsupervised Learning using Nonequilibrium Thermodynamics
     Jascha Narain Sohl-Dickstein, Eric A. Weiss, Niru Maheswaranathan, Surya Ganguli
     Tags: SyDa, DiffM. 12 Mar 2015. 288 / 6,925 / 0
  6. What Regularized Auto-Encoders Learn from the Data Generating Distribution
     Guillaume Alain, Yoshua Bengio
     Tags: OOD, DRL. 18 Nov 2012. 64 / 502 / 0