Adv-KD: Adversarial Knowledge Distillation for Faster Diffusion Sampling (arXiv:2405.20675)

31 May 2024
Kidist Amde Mekonnen, Nicola Dall'Asen, Paolo Rota

Papers citing "Adv-KD: Adversarial Knowledge Distillation for Faster Diffusion Sampling"

2 / 2 papers shown
Knowledge Distillation in Iterative Generative Models for Improved Sampling Speed
Eric Luhman, Troy Luhman
DiffM
07 Jan 2021

A Style-Based Generator Architecture for Generative Adversarial Networks
Tero Karras, S. Laine, Timo Aila
12 Dec 2018