BOOT: Data-free Distillation of Denoising Diffusion Models with Bootstrapping (arXiv:2306.05544)

8 June 2023
Jiatao Gu, Shuangfei Zhai, Yizhe Zhang, Lingjie Liu, J. Susskind
DiffM

Papers citing "BOOT: Data-free Distillation of Denoising Diffusion Models with Bootstrapping"

14 / 14 papers shown
Integration Flow Models
Jingjing Wang, Dan Zhang, Joshua Luo, Yin Yang, Feng Luo
28 Apr 2025
Fast Autoregressive Models for Continuous Latent Generation
Tiankai Hang, Jianmin Bao, Fangyun Wei, Dong Chen
DiffM
24 Apr 2025
Self-Corrected Flow Distillation for Consistent One-Step and Few-Step Text-to-Image Generation
Quan Dao, Hao Phung, T. Dao, Dimitris Metaxas, Anh Tran
22 Dec 2024
DKDM: Data-Free Knowledge Distillation for Diffusion Models with Any Architecture
Qianlong Xiang, Miao Zhang, Yuzhang Shang, Jianlong Wu, Yan Yan, Liqiang Nie
DiffM
05 Sep 2024
Improving the Training of Rectified Flows
Sangyun Lee, Zinan Lin, Giulia Fanti
30 May 2024
PeRFlow: Piecewise Rectified Flow as Universal Plug-and-Play Accelerator
Hanshu Yan, Xingchao Liu, Jiachun Pan, Jun Hao Liew, Qiang Liu, Jiashi Feng
13 May 2024
Distilling Diffusion Models into Conditional GANs
Minguk Kang, Richard Zhang, Connelly Barnes, Sylvain Paris, Suha Kwak, Jaesik Park, Eli Shechtman, Jun-Yan Zhu, Taesung Park
09 May 2024
Trajectory Consistency Distillation: Improved Latent Consistency Distillation by Semi-Linear Consistency Function with Trajectory Mapping
Jianbin Zheng, Minghui Hu, Zhongyi Fan, Chaoyue Wang, Changxing Ding, Dacheng Tao, Tat-Jen Cham
29 Feb 2024
Fast Sampling Through the Reuse of Attention Maps in Diffusion Models
Rosco Hunter, Łukasz Dudziak, Mohamed S. Abdelfattah, Abhinav Mehrotra, Sourav Bhattacharya, Hongkai Wen
13 Dec 2023
Continual Learning of Diffusion Models with Generative Distillation
Sergi Masip, Pau Rodriguez, Tinne Tuytelaars, Gido M. van de Ven
VLM, DiffM
23 Nov 2023
Distilling Step-by-Step! Outperforming Larger Language Models with Less Training Data and Smaller Model Sizes
Lokesh Nagalapatti, Chun-Liang Li, Chih-Kuan Yeh, Hootan Nakhost, Yasuhisa Fujii, Alexander Ratner, Ranjay Krishna, Chen-Yu Lee, Tomas Pfister
ALM
03 May 2023
f-DM: A Multi-stage Diffusion Model via Progressive Signal Transformation
Jiatao Gu, Shuangfei Zhai, Yizhe Zhang, Miguel Angel Bautista, J. Susskind
DiffM
10 Oct 2022
Diffusion-LM Improves Controllable Text Generation
Xiang Lisa Li, John Thickstun, Ishaan Gulrajani, Percy Liang, Tatsunori B. Hashimoto
AI4CE
27 May 2022
Knowledge Distillation in Iterative Generative Models for Improved Sampling Speed
Eric Luhman, Troy Luhman
DiffM
07 Jan 2021