Contrastive Model Inversion for Data-Free Knowledge Distillation

18 May 2021
Gongfan Fang, Jie Song, Xinchao Wang, Chen Shen, Xingen Wang, Xiuming Zhang
arXiv:2105.08584

Papers citing "Contrastive Model Inversion for Data-Free Knowledge Distillation"

16 papers shown

Simple Semi-supervised Knowledge Distillation from Vision-Language Models via Dual-Head Optimization
Seongjae Kang, Dong Bok Lee, Hyungjoon Jang, Sung Ju Hwang
VLM · 12 May 2025

DKDM: Data-Free Knowledge Distillation for Diffusion Models with Any Architecture
Qianlong Xiang, Miao Zhang, Yuzhang Shang, Jianlong Wu, Yan Yan, Liqiang Nie
DiffM · 05 Sep 2024

Distilling Vision-Language Foundation Models: A Data-Free Approach via Prompt Diversification
Yunyi Xuan, Weijie Chen, Shicai Yang, Di Xie, Luojun Lin, Yueting Zhuang
VLM · 21 Jul 2024

Data-free Knowledge Distillation for Fine-grained Visual Categorization
Renrong Shao, Wei Zhang, Jianhua Yin, Jun Wang
18 Apr 2024

Learning to Project for Cross-Task Knowledge Distillation
Dylan Auty, Roy Miles, Benedikt Kolbeinsson, K. Mikolajczyk
21 Mar 2024

Towards Few-Call Model Stealing via Active Self-Paced Knowledge Distillation and Diffusion-Based Image Generation
Vlad Hondru, Radu Tudor Ionescu
DiffM · 29 Sep 2023

Sampling to Distill: Knowledge Transfer from Open-World Data
Yuzheng Wang, Zhaoyu Chen, Jie M. Zhang, Dingkang Yang, Zuhao Ge, Yang Liu, Siao Liu, Yunquan Sun, Wenqiang Zhang, Lizhe Qi
31 Jul 2023

Learning to Learn from APIs: Black-Box Data-Free Meta-Learning
Zixuan Hu, Li Shen, Zhenyi Wang, Baoyuan Wu, Chun Yuan, Dacheng Tao
28 May 2023

FoPro-KD: Fourier Prompted Effective Knowledge Distillation for Long-Tailed Medical Image Recognition
Marawan Elbatel, Robert Martí, Xiaomeng Li
AAML · 27 May 2023

Is Synthetic Data From Diffusion Models Ready for Knowledge Distillation?
Zheng Li, Yuxuan Li, Penghai Zhao, Renjie Song, Xiang Li, Jian Yang
22 May 2023

Architecture, Dataset and Model-Scale Agnostic Data-free Meta-Learning
Zixuan Hu, Li Shen, Zhenyi Wang, Tongliang Liu, Chun Yuan, Dacheng Tao
20 Mar 2023

Momentum Adversarial Distillation: Handling Large Distribution Shifts in Data-Free Knowledge Distillation
Kien Do, Hung Le, D. Nguyen, Dang Nguyen, Haripriya Harikumar, T. Tran, Santu Rana, Svetha Venkatesh
21 Sep 2022

Dynamic Data-Free Knowledge Distillation by Easy-to-Hard Learning Strategy
Jingru Li, Sheng Zhou, Liangcheng Li, Haishuai Wang, Zhi Yu, Jiajun Bu
29 Aug 2022

IDEAL: Query-Efficient Data-Free Learning from Black-box Models
Jie M. Zhang, Chen Chen, Lingjuan Lyu
23 May 2022

Overcoming Catastrophic Forgetting in Graph Neural Networks
Huihui Liu, Yiding Yang, Xinchao Wang
10 Dec 2020

Distilling Knowledge from Graph Convolutional Networks
Yiding Yang, Jiayan Qiu, Xiuming Zhang, Dacheng Tao, Xinchao Wang
23 Mar 2020