arXiv:2003.09088
Data-Free Knowledge Amalgamation via Group-Stack Dual-GAN
Jingwen Ye, Yixin Ji, Xinchao Wang, Xin Gao, Mingli Song
20 March 2020

Papers citing "Data-Free Knowledge Amalgamation via Group-Stack Dual-GAN" (9 of 9 papers shown)

Sampling to Distill: Knowledge Transfer from Open-World Data
Yuzheng Wang, Zhaoyu Chen, Jie M. Zhang, Dingkang Yang, Zuhao Ge, Yang Liu, Siao Liu, Yunquan Sun, Wenqiang Zhang, Lizhe Qi
31 Jul 2023

FoPro-KD: Fourier Prompted Effective Knowledge Distillation for Long-Tailed Medical Image Recognition [AAML]
Marawan Elbatel, Robert Martí, Xiaomeng Li
27 May 2023

TinyMIM: An Empirical Study of Distilling MIM Pre-trained Models
Sucheng Ren, Fangyun Wei, Zheng-Wei Zhang, Han Hu
03 Jan 2023

IDEAL: Query-Efficient Data-Free Learning from Black-box Models
Jie M. Zhang, Chen Chen, Lingjuan Lyu
23 May 2022

DearKD: Data-Efficient Early Knowledge Distillation for Vision Transformers [ViT]
Xianing Chen, Qiong Cao, Yujie Zhong, Jing Zhang, Shenghua Gao, Dacheng Tao
27 Apr 2022

The Augmented Image Prior: Distilling 1000 Classes by Extrapolating from a Single Image
Yuki M. Asano, Aaqib Saeed
01 Dec 2021

Training Generative Adversarial Networks in One Stage [GAN]
Chengchao Shen, Youtan Yin, Xinchao Wang, Xubin Li, Jie Song, Mingli Song
28 Feb 2021

Learning Propagation Rules for Attribution Map Generation [FAtt]
Yiding Yang, Jiayan Qiu, Mingli Song, Dacheng Tao, Xinchao Wang
14 Oct 2020

Knowledge Distillation: A Survey [VLM]
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
09 Jun 2020