arXiv:2305.12954
Is Synthetic Data From Diffusion Models Ready for Knowledge Distillation?
22 May 2023
Zheng Li, Yuxuan Li, Penghai Zhao, Renjie Song, Xiang Li, Jian Yang
Links: ArXiv (abs) · PDF · HTML · GitHub (47★)
Papers citing "Is Synthetic Data From Diffusion Models Ready for Knowledge Distillation?" (44 / 44 papers shown)

D-Feat Occlusions: Diffusion Features for Robustness to Partial Visual Occlusions in Object Recognition
Rupayan Mallick, Sibo Dong, Nataniel Ruiz, Sarah Adel Bargal
DiffM · 244 · 0 · 0 · 08 Apr 2025

Synthetic Data from Diffusion Models Improves ImageNet Classification
Shekoofeh Azizi, Simon Kornblith, Chitwan Saharia, Mohammad Norouzi, David J. Fleet
VLM, DiffM · 112 · 315 · 0 · 17 Apr 2023

ImageReward: Learning and Evaluating Human Preferences for Text-to-Image Generation
Jiazheng Xu, Xiao Liu, Yuchen Wu, Yuxuan Tong, Qinkai Li, Ming Ding, Jie Tang, Yuxiao Dong
152 · 408 · 0 · 12 Apr 2023

Effective Data Augmentation With Diffusion Models
Brandon Trabucco, Kyle Doherty, Max Gurinas, Ruslan Salakhutdinov
DiffM, VLM · 103 · 256 · 0 · 07 Feb 2023

Scalable Diffusion Models with Transformers
William S. Peebles, Saining Xie
GNN · 124 · 2,436 · 0 · 19 Dec 2022

Curriculum Temperature for Knowledge Distillation
Zheng Li, Xiang Li, Lingfeng Yang, Borui Zhao, Renjie Song, Lei Luo, Jun Yu Li, Jian Yang
66 · 144 · 0 · 29 Nov 2022

DPM-Solver++: Fast Solver for Guided Sampling of Diffusion Probabilistic Models
Cheng Lu, Yuhao Zhou, Fan Bao, Jianfei Chen, Chongxuan Li, Jun Zhu
DiffM · 193 · 615 · 0 · 02 Nov 2022

Is synthetic data from generative models ready for image recognition?
Ruifei He, Shuyang Sun, Xin Yu, Chuhui Xue, Wenqing Zhang, Philip Torr, Song Bai, Xiaojuan Qi
104 · 302 · 0 · 14 Oct 2022

ViTKD: Practical Guidelines for ViT feature knowledge distillation
Zhendong Yang, Zhe Li, Ailing Zeng, Zexian Li, Chun Yuan, Yu Li
133 · 42 · 0 · 06 Sep 2022

Restoring Vision in Adverse Weather Conditions with Patch-Based Denoising Diffusion Models
Ozan Özdenizci, Robert Legenstein
DiffM · 111 · 267 · 0 · 29 Jul 2022

Revisiting Label Smoothing and Knowledge Distillation Compatibility: What was Missing?
Keshigeyan Chandrasegaran, Ngoc-Trung Tran, Yunqing Zhao, Ngai-Man Cheung
145 · 44 · 0 · 29 Jun 2022

DPM-Solver: A Fast ODE Solver for Diffusion Probabilistic Model Sampling in Around 10 Steps
Cheng Lu, Yuhao Zhou, Fan Bao, Jianfei Chen, Chongxuan Li, Jun Zhu
DiffM · 241 · 1,464 · 0 · 02 Jun 2022

Hierarchical Text-Conditional Image Generation with CLIP Latents
Aditya A. Ramesh, Prafulla Dhariwal, Alex Nichol, Casey Chu, Mark Chen
VLM, DiffM · 425 · 6,921 · 0 · 13 Apr 2022

Decoupled Knowledge Distillation
Borui Zhao, Quan Cui, Renjie Song, Yiyu Qiu, Jiajun Liang
89 · 550 · 0 · 16 Mar 2022

Generative Adversarial Networks
Gilad Cohen, Raja Giryes
GAN · 301 · 30,152 · 0 · 01 Mar 2022

RePaint: Inpainting using Denoising Diffusion Probabilistic Models
Andreas Lugmayr, Martin Danelljan, Andrés Romero, Feng Yu, Radu Timofte, Luc Van Gool
DiffM · 357 · 1,427 · 0 · 24 Jan 2022

GLIDE: Towards Photorealistic Image Generation and Editing with Text-Guided Diffusion Models
Alex Nichol, Prafulla Dhariwal, Aditya A. Ramesh, Pranav Shyam, Pamela Mishkin, Bob McGrew, Ilya Sutskever, Mark Chen
381 · 3,630 · 0 · 20 Dec 2021

The Augmented Image Prior: Distilling 1000 Classes by Extrapolating from a Single Image
Yuki M. Asano, Aaqib Saeed
92 · 7 · 0 · 01 Dec 2021

Learning Sparse Masks for Diffusion-based Image Inpainting
Tobias Alt, Pascal Peter, Joachim Weickert
DiffM · 49 · 13 · 0 · 06 Oct 2021

Cascaded Diffusion Models for High Fidelity Image Generation
Jonathan Ho, Chitwan Saharia, William Chan, David J. Fleet, Mohammad Norouzi, Tim Salimans
170 · 1,239 · 0 · 30 May 2021

Contrastive Model Inversion for Data-Free Knowledge Distillation
Gongfan Fang, Mingli Song, Xinchao Wang, Chen Shen, Xingen Wang, Xiuming Zhang
56 · 82 · 0 · 18 May 2021

Diffusion Models Beat GANs on Image Synthesis
Prafulla Dhariwal, Alex Nichol
310 · 7,971 · 0 · 11 May 2021

SRDiff: Single Image Super-Resolution with Diffusion Probabilistic Models
Haoying Li, Yifan Yang, Meng Chang, H. Feng, Zhi-hai Xu, Qi Li, Yue-ting Chen
DiffM · 80 · 637 · 0 · 30 Apr 2021

Distilling Knowledge via Knowledge Review
Pengguang Chen, Shu Liu, Hengshuang Zhao, Jiaya Jia
197 · 447 · 0 · 19 Apr 2021

Zero-Shot Text-to-Image Generation
Aditya A. Ramesh, Mikhail Pavlov, Gabriel Goh, Scott Gray, Chelsea Voss, Alec Radford, Mark Chen, Ilya Sutskever
VLM · 420 · 5,005 · 0 · 24 Feb 2021

Improved Denoising Diffusion Probabilistic Models
Alex Nichol, Prafulla Dhariwal
DiffM · 354 · 3,728 · 0 · 18 Feb 2021

Large-Scale Generative Data-Free Distillation
Liangchen Luo, Mark Sandler, Zi Lin, A. Zhmoginov, Andrew G. Howard
73 · 43 · 0 · 10 Dec 2020

Score-Based Generative Modeling through Stochastic Differential Equations
Yang Song, Jascha Narain Sohl-Dickstein, Diederik P. Kingma, Abhishek Kumar, Stefano Ermon, Ben Poole
DiffM, SyDa · 385 · 6,592 · 0 · 26 Nov 2020

Online Knowledge Distillation via Multi-branch Diversity Enhancement
Zheng Li, Ying Huang, Defang Chen, Tianren Luo, Ning Cai, Zhigeng Pan
63 · 28 · 0 · 02 Oct 2020

Data-Free Network Quantization With Adversarial Knowledge Distillation
Yoojin Choi, Jihwan P. Choi, Mostafa El-Khamy, Jungwon Lee
MQ · 74 · 121 · 0 · 08 May 2020

Dreaming to Distill: Data-free Knowledge Transfer via DeepInversion
Hongxu Yin, Pavlo Molchanov, Zhizhong Li, J. Álvarez, Arun Mallya, Derek Hoiem, N. Jha, Jan Kautz
89 · 569 · 0 · 18 Dec 2019

The Knowledge Within: Methods for Data-Free Model Compression
Matan Haroush, Itay Hubara, Elad Hoffer, Daniel Soudry
62 · 109 · 0 · 03 Dec 2019

Online Knowledge Distillation with Diverse Peers
Defang Chen, Jian-Ping Mei, Can Wang, Yan Feng, Chun-Yen Chen
FedML · 87 · 302 · 0 · 01 Dec 2019

Contrastive Representation Distillation
Yonglong Tian, Dilip Krishnan, Phillip Isola
181 · 1,054 · 0 · 23 Oct 2019

Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation
Linfeng Zhang, Jiebo Song, Anni Gao, Jingwei Chen, Chenglong Bao, Kaisheng Ma
FedML · 76 · 865 · 0 · 17 May 2019

CutMix: Regularization Strategy to Train Strong Classifiers with Localizable Features
Sangdoo Yun, Dongyoon Han, Seong Joon Oh, Sanghyuk Chun, Junsuk Choe, Y. Yoo
OOD · 629 · 4,814 · 0 · 13 May 2019

Structured Knowledge Distillation for Dense Prediction
Yifan Liu, Chris Liu, Jingdong Wang, Zhenbo Luo
104 · 585 · 0 · 11 Mar 2019

Fast Human Pose Estimation
Feng Zhang, Xiatian Zhu, Mao Ye
3DH · 81 · 238 · 0 · 13 Nov 2018

ShuffleNet V2: Practical Guidelines for Efficient CNN Architecture Design
Ningning Ma, Xiangyu Zhang, Haitao Zheng, Jian Sun
183 · 5,019 · 0 · 30 Jul 2018

Data-Free Knowledge Distillation for Deep Neural Networks
Raphael Gontijo-Lopes, Stefano Fenu, Thad Starner
71 · 273 · 0 · 19 Oct 2017

Deep Mutual Learning
Ying Zhang, Tao Xiang, Timothy M. Hospedales, Huchuan Lu
FedML · 155 · 1,656 · 0 · 01 Jun 2017

Paying More Attention to Attention: Improving the Performance of Convolutional Neural Networks via Attention Transfer
Sergey Zagoruyko, N. Komodakis
147 · 2,590 · 0 · 12 Dec 2016

Deep Unsupervised Learning using Nonequilibrium Thermodynamics
Jascha Narain Sohl-Dickstein, Eric A. Weiss, Niru Maheswaranathan, Surya Ganguli
SyDa, DiffM · 312 · 7,035 · 0 · 12 Mar 2015

FitNets: Hints for Thin Deep Nets
Adriana Romero, Nicolas Ballas, Samira Ebrahimi Kahou, Antoine Chassang, C. Gatta, Yoshua Bengio
FedML · 332 · 3,906 · 0 · 19 Dec 2014