How and When Adversarial Robustness Transfers in Knowledge Distillation?
arXiv: 2110.12072
22 October 2021
Rulin Shao, Ming Zhou, C. Bezemer, Cho-Jui Hsieh
AAML
Papers citing "How and When Adversarial Robustness Transfers in Knowledge Distillation?" (4 papers shown)
Releasing Inequality Phenomena in $L_{\infty}$-Adversarial Training via Input Gradient Distillation
Junxi Chen, Junhao Dong, Xiaohua Xie
AAML · 16 May 2023

Maximum Likelihood Distillation for Robust Modulation Classification
Javier Maroto, Gérôme Bovet, P. Frossard
AAML · 01 Nov 2022

Squeeze Training for Adversarial Robustness
Qizhang Li, Yiwen Guo, W. Zuo, Hao Chen
OOD · 23 May 2022

Intriguing Properties of Vision Transformers
Muzammal Naseer, Kanchana Ranasinghe, Salman Khan, Munawar Hayat, Fahad Shahbaz Khan, Ming-Hsuan Yang
ViT · 21 May 2021