Dreaming to Distill: Data-free Knowledge Transfer via DeepInversion

18 December 2019
Hongxu Yin, Pavlo Molchanov, Zhizhong Li, J. Álvarez, Arun Mallya, Derek Hoiem, N. Jha, Jan Kautz
arXiv:1912.08795 (abs) · PDF · HTML · GitHub (504★)

Papers citing "Dreaming to Distill: Data-free Knowledge Transfer via DeepInversion"

4 / 354 papers shown

Distill, Adapt, Distill: Training Small, In-Domain Models for Neural Machine Translation
Mitchell A. Gordon, Kevin Duh
05 Mar 2020

The Knowledge Within: Methods for Data-Free Model Compression
Matan Haroush, Itay Hubara, Elad Hoffer, Daniel Soudry
03 Dec 2019

Privacy-Preserving Blockchain-Based Federated Learning for IoT Devices
Yang Zhao, Jun Zhao, Linshan Jiang, Rui Tan, Dusit Niyato, Zengxiang Li, Lingjuan Lyu, Yingbo Liu
26 Jun 2019

ArcFace: Additive Angular Margin Loss for Deep Face Recognition
Jiankang Deng, Jiaxin Guo, J. Yang, Niannan Xue, I. Kotsia, Stefanos Zafeiriou
23 Jan 2018