1811.05072
Private Model Compression via Knowledge Distillation
13 November 2018
Ji Wang, Weidong Bao, Lichao Sun, Xiaomin Zhu, Bokai Cao, Philip S. Yu
Topic: FedML
Papers citing "Private Model Compression via Knowledge Distillation" (20 / 20 papers shown)
| Title | Authors | Topic | Metrics | Date |
|---|---|---|---|---|
| Graph-based Knowledge Distillation: A Survey and Experimental Evaluation | Jing Liu, Tongya Zheng, Guanzheng Zhang, Qinfen Hao | — | 35 · 8 · 0 | 27 Feb 2023 |
| Swing Distillation: A Privacy-Preserving Knowledge Distillation Framework | Junzhuo Li, Xinwei Wu, Weilong Dong, Shuangzhi Wu, Chao Bian, Deyi Xiong | — | 31 · 3 · 0 | 16 Dec 2022 |
| FS-BAN: Born-Again Networks for Domain Generalization Few-Shot Classification | Yunqing Zhao, Ngai-man Cheung | BDL | 25 · 12 · 0 | 23 Aug 2022 |
| Stacked Hybrid-Attention and Group Collaborative Learning for Unbiased Scene Graph Generation | Xingning Dong, Tian Gan, Xuemeng Song, Jianlong Wu, Yuan Cheng, Liqiang Nie | — | 24 · 92 · 0 | 18 Mar 2022 |
| MobileFaceSwap: A Lightweight Framework for Video Face Swapping | Zhi-liang Xu, Zhibin Hong, Changxing Ding, Zhen Zhu, Junyu Han, Jingtuo Liu, Errui Ding | CVBM | 27 · 48 · 0 | 11 Jan 2022 |
| Unsupervised Domain Adaptive Person Re-Identification via Human Learning Imitation | Yang Peng, Ping Liu, Yawei Luo, Pan Zhou, Zichuan Xu, Jingen Liu | OOD | 23 · 0 · 0 | 28 Nov 2021 |
| How and When Adversarial Robustness Transfers in Knowledge Distillation? | Rulin Shao, Ming Zhou, C. Bezemer, Cho-Jui Hsieh | AAML | 32 · 17 · 0 | 22 Oct 2021 |
| Privacy-Preserving Machine Learning: Methods, Challenges and Directions | Runhua Xu, Nathalie Baracaldo, J. Joshi | — | 32 · 100 · 0 | 10 Aug 2021 |
| Honest-but-Curious Nets: Sensitive Attributes of Private Inputs Can Be Secretly Coded into the Classifiers' Outputs | Mohammad Malekzadeh, Anastasia Borovykh, Deniz Gündüz | MIACV | 26 · 42 · 0 | 25 May 2021 |
| Knowledge Distillation as Semiparametric Inference | Tri Dao, G. Kamath, Vasilis Syrgkanis, Lester W. Mackey | — | 40 · 31 · 0 | 20 Apr 2021 |
| Compact CNN Structure Learning by Knowledge Distillation | Waqar Ahmed, Andrea Zunino, Pietro Morerio, Vittorio Murino | — | 38 · 5 · 0 | 19 Apr 2021 |
| An Information-Theoretic Justification for Model Pruning | Berivan Isik, Tsachy Weissman, Albert No | — | 95 · 35 · 0 | 16 Feb 2021 |
| Federated Model Distillation with Noise-Free Differential Privacy | Lichao Sun, Lingjuan Lyu | FedML | 31 · 106 · 0 | 11 Sep 2020 |
| Communication-Efficient and Distributed Learning Over Wireless Networks: Principles and Applications | Jihong Park, S. Samarakoon, Anis Elgabli, Joongheon Kim, M. Bennis, Seong-Lyun Kim, Mérouane Debbah | — | 39 · 161 · 0 | 06 Aug 2020 |
| Knowledge Distillation: A Survey | Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao | VLM | 23 · 2,857 · 0 | 09 Jun 2020 |
| BiDet: An Efficient Binarized Object Detector | Ziwei Wang, Ziyi Wu, Jiwen Lu, Jie Zhou | MQ | 62 · 64 · 0 | 09 Mar 2020 |
| Reviewing and Improving the Gaussian Mechanism for Differential Privacy | Jun Zhao, Teng Wang, Tao Bai, Kwok-Yan Lam, Zhiying Xu, Shuyu Shi, Xuebin Ren, Xinyu Yang, Yang Liu, Han Yu | — | 44 · 30 · 0 | 27 Nov 2019 |
| Membership Privacy for Machine Learning Models Through Knowledge Transfer | Virat Shejwalkar, Amir Houmansadr | — | 22 · 10 · 0 | 15 Jun 2019 |
| Divide and Conquer: Leveraging Intermediate Feature Representations for Quantized Training of Neural Networks | Ahmed T. Elthakeb, Prannoy Pilligundla, Alex Cloninger, H. Esmaeilzadeh | MQ | 26 · 8 · 0 | 14 Jun 2019 |
| Copying Machine Learning Classifiers | Irene Unceta, Jordi Nin, O. Pujol | — | 14 · 18 · 0 | 05 Mar 2019 |