arXiv: 1912.08795
Dreaming to Distill: Data-free Knowledge Transfer via DeepInversion
18 December 2019
Hongxu Yin, Pavlo Molchanov, Zhizhong Li, J. Álvarez, Arun Mallya, Derek Hoiem, N. Jha, Jan Kautz
Links: ArXiv (abs) · PDF · HTML · GitHub (504★)
Papers citing "Dreaming to Distill: Data-free Knowledge Transfer via DeepInversion" (4 of 354 papers shown):

- Distill, Adapt, Distill: Training Small, In-Domain Models for Neural Machine Translation (CLL, VLM). Mitchell A. Gordon, Kevin Duh. 05 Mar 2020.
- The Knowledge Within: Methods for Data-Free Model Compression. Matan Haroush, Itay Hubara, Elad Hoffer, Daniel Soudry. 03 Dec 2019.
- Privacy-Preserving Blockchain-Based Federated Learning for IoT Devices. Yang Zhao, Jun Zhao, Linshan Jiang, Rui Tan, Dusit Niyato, Zengxiang Li, Lingjuan Lyu, Yingbo Liu. 26 Jun 2019.
- ArcFace: Additive Angular Margin Loss for Deep Face Recognition (CVBM). Jiankang Deng, Jiaxin Guo, J. Yang, Niannan Xue, I. Kotsia, Stefanos Zafeiriou. 23 Jan 2018.