Knowledge Distillation Meets Self-Supervision
Guodong Xu, Ziwei Liu, Xiaoxiao Li, Chen Change Loy
arXiv:2006.07114, 12 June 2020
Papers citing "Knowledge Distillation Meets Self-Supervision" (50 of 145 papers shown):
Head3D: Complete 3D Head Generation via Tri-plane Feature Distillation. Y. Cheng, Yichao Yan, Wenhan Zhu, Ye Pan, Bowen Pan, Xiaokang Yang. 3DH, 28 Mar 2023.
MobileVOS: Real-Time Video Object Segmentation Contrastive Learning meets Knowledge Distillation. Roy Miles, M. K. Yucel, Bruno Manganelli, Albert Saà-Garriga. VOS, 14 Mar 2023.
Multitask Prompt Tuning Enables Parameter-Efficient Transfer Learning. Zhen Wang, Rameswar Panda, Leonid Karlinsky, Rogerio Feris, Huan Sun, Yoon Kim. VLM, VPVLM, 06 Mar 2023.
Distilling Calibrated Student from an Uncalibrated Teacher. Ishan Mishra, Sethu Vamsi Krishna, Deepak Mishra. FedML, 22 Feb 2023.
Knowledge Distillation in Vision Transformers: A Critical Review. Gousia Habib, Tausifa Jan Saleem, Brejesh Lall. 04 Feb 2023.
Rethinking Soft Label in Label Distribution Learning Perspective. Seungbum Hong, Jihun Yoon, Bogyu Park, Min-Kook Choi. 31 Jan 2023.
Unifying Synergies between Self-supervised Learning and Dynamic Computation. Tarun Krishna, Ayush K. Rai, Alexandru Drimbarean, Eric Arazo, Paul Albert, Alan F. Smeaton, Kevin McGuinness, Noel E. O'Connor. 22 Jan 2023.
A Survey on Self-supervised Learning: Algorithms, Applications, and Future Trends. Jie Gui, Tuo Chen, Jing Zhang, Qiong Cao, Zhe Sun, Haoran Luo, Dacheng Tao. 13 Jan 2023.
Masked Video Distillation: Rethinking Masked Feature Modeling for Self-supervised Video Representation Learning. Rui Wang, Dongdong Chen, Zuxuan Wu, Yinpeng Chen, Xiyang Dai, Mengchen Liu, Lu Yuan, Yu-Gang Jiang. VGen, 08 Dec 2022.
Occlusion-Robust FAU Recognition by Mining Latent Space of Masked Autoencoders. Minyang Jiang, Yongwei Wang, Martin J. McKeown, Jane Wang. CVBM, 08 Dec 2022.
Hilbert Distillation for Cross-Dimensionality Networks. Dian Qin, Haishuai Wang, Zhe Liu, Hongjia Xu, Sheng Zhou, Jiajun Bu. 08 Nov 2022.
Teacher-Student Architecture for Knowledge Learning: A Survey. Chengming Hu, Xuan Li, Dan Liu, Xi Chen, Ju Wang, Xue Liu. 28 Oct 2022.
Online Cross-Layer Knowledge Distillation on Graph Neural Networks with Deep Supervision. Jiongyu Guo, Defang Chen, Can Wang. 25 Oct 2022.
Respecting Transfer Gap in Knowledge Distillation. Yulei Niu, Long Chen, Chan Zhou, Hanwang Zhang. 23 Oct 2022.
Deep Active Ensemble Sampling For Image Classification. S. Mohamadi, Gianfranco Doretto, Donald Adjeroh. UQCV, 11 Oct 2022.
Towards a Unified View of Affinity-Based Knowledge Distillation. Vladimir Li, A. Maki. 30 Sep 2022.
Music Source Separation with Band-split RNN. Yi Luo, Jianwei Yu. 30 Sep 2022.
On-Device Domain Generalization. Kaiyang Zhou, Yuanhan Zhang, Yuhang Zang, Jingkang Yang, Chen Change Loy, Ziwei Liu. OOD, 15 Sep 2022.
Switchable Online Knowledge Distillation. Biao Qian, Yang Wang, Hongzhi Yin, Richang Hong, Meng Wang. 12 Sep 2022.
PANDA: Prompt Transfer Meets Knowledge Distillation for Efficient Model Adaptation. Qihuang Zhong, Liang Ding, Juhua Liu, Bo Du, Dacheng Tao. VLM, CLL, 22 Aug 2022.
GCISG: Guided Causal Invariant Learning for Improved Syn-to-real Generalization. Gilhyun Nam, Gyeongjae Choi, Kyungmin Lee. OOD, 22 Aug 2022.
Mind the Gap in Distilling StyleGANs. Guodong Xu, Yuenan Hou, Ziwei Liu, Chen Change Loy. GAN, 18 Aug 2022.
Online Knowledge Distillation via Mutual Contrastive Learning for Visual Recognition. Chuanguang Yang, Zhulin An, Helong Zhou, Fuzhen Zhuang, Yongjun Xu, Qian Zhang. 23 Jul 2022.
DSPNet: Towards Slimmable Pretrained Networks based on Discriminative Self-supervised Learning. Shaoru Wang, Zeming Li, Jin Gao, Liang Li, Weiming Hu. 13 Jul 2022.
Knowledge Condensation Distillation. Chenxin Li, Mingbao Lin, Zhiyuan Ding, Nie Lin, Yihong Zhuang, Yue Huang, Xinghao Ding, Liujuan Cao. 12 Jul 2022.
Contrastive Deep Supervision. Linfeng Zhang, Xin Chen, Junbo Zhang, Runpei Dong, Kaisheng Ma. 12 Jul 2022.
ProSelfLC: Progressive Self Label Correction Towards A Low-Temperature Entropy State. Xinshao Wang, Yang Hua, Elyor Kodirov, S. Mukherjee, David A. Clifton, N. Robertson. 30 Jun 2022.
Mixed Sample Augmentation for Online Distillation. Yiqing Shen, Liwu Xu, Yuzhe Yang, Yaqian Li, Yandong Guo. 24 Jun 2022.
Toward Student-Oriented Teacher Network Training For Knowledge Distillation. Chengyu Dong, Liyuan Liu, Jingbo Shang. 14 Jun 2022.
ORC: Network Group-based Knowledge Distillation using Online Role Change. Jun-woo Choi, Hyeon Cho, Seockhwa Jeong, Wonjun Hwang. 01 Jun 2022.
Knowledge Distillation Meets Open-Set Semi-Supervised Learning. Jing Yang, Xiatian Zhu, Adrian Bulat, Brais Martínez, Georgios Tzimiropoulos. 13 May 2022.
Generalized Knowledge Distillation via Relationship Matching. Han-Jia Ye, Su Lu, De-Chuan Zhan. FedML, 04 May 2022.
Proto2Proto: Can you recognize the car, the way I do? Monish Keswani, Sriranjani Ramakrishnan, Nishant Reddy, V. Balasubramanian. 25 Apr 2022.
Selective Cross-Task Distillation. Su Lu, Han-Jia Ye, De-Chuan Zhan. 25 Apr 2022.
DistPro: Searching A Fast Knowledge Distillation Process via Meta Optimization. XueQing Deng, Dawei Sun, Shawn D. Newsam, Peng Wang. 12 Apr 2022.
CD^2-pFed: Cyclic Distillation-guided Channel Decoupling for Model Personalization in Federated Learning. Yiqing Shen, Yuyin Zhou, Lequan Yu. OOD, 08 Apr 2022.
Non-Local Latent Relation Distillation for Self-Adaptive 3D Human Pose Estimation. Jogendra Nath Kundu, Siddharth Seth, Anirudh Gururaj Jamkhandi, Pradyumna, Varun Jampani, Anirban Chakraborty, R. Venkatesh Babu. 3DH, 05 Apr 2022.
Self-Distillation from the Last Mini-Batch for Consistency Regularization. Yiqing Shen, Liwu Xu, Yuzhe Yang, Yaqian Li, Yandong Guo. 30 Mar 2022.
Knowledge Distillation with the Reused Teacher Classifier. Defang Chen, Jianhan Mei, Hailin Zhang, C. Wang, Yan Feng, Chun-Yen Chen. 26 Mar 2022.
SSD-KD: A Self-supervised Diverse Knowledge Distillation Method for Lightweight Skin Lesion Classification Using Dermoscopic Images. Yongwei Wang, Yuheng Wang, Tim K. Lee, C. Miao, Z. J. Wang. 22 Mar 2022.
Exploring Patch-wise Semantic Relation for Contrastive Learning in Image-to-Image Translation Tasks. Chanyong Jung, Gihyun Kwon, Jong Chul Ye. 03 Mar 2022.
Distillation with Contrast is All You Need for Self-Supervised Point Cloud Representation Learning. Kexue Fu, Peng Gao, Renrui Zhang, Hongsheng Li, Yu Qiao, Manning Wang. SSL, 3DPC, 09 Feb 2022.
Adaptive Mixing of Auxiliary Losses in Supervised Learning. D. Sivasubramanian, Ayush Maheshwari, Pradeep Shenoy, A. Prathosh, Ganesh Ramakrishnan. 07 Feb 2022.
Iterative Self Knowledge Distillation -- From Pothole Classification to Fine-Grained and COVID Recognition. Kuan-Chuan Peng. 04 Feb 2022.
Multi-Modality Distillation via Learning the teacher's modality-level Gram Matrix. Peng Liu. 21 Dec 2021.
Temporal Transformer Networks with Self-Supervision for Action Recognition. Yongkang Zhang, Jun Li, Guoming Wu, Hanjie Zhang, Zhiping Shi, Zhaoxun Liu, Zizhang Wu. ViT, 14 Dec 2021.
Auxiliary Learning for Self-Supervised Video Representation via Similarity-based Knowledge Distillation. Amirhossein Dadashzadeh, Alan Whone, Majid Mirmehdi. SSL, 07 Dec 2021.
Information Theoretic Representation Distillation. Roy Miles, Adrian Lopez-Rodriguez, K. Mikolajczyk. MQ, 01 Dec 2021.
Unsupervised Domain Adaptive Person Re-Identification via Human Learning Imitation. Yang Peng, Ping Liu, Yawei Luo, Pan Zhou, Zichuan Xu, Jingen Liu. OOD, 28 Nov 2021.
EvDistill: Asynchronous Events to End-task Learning via Bidirectional Reconstruction-guided Cross-modal Knowledge Distillation. Lin Wang, Yujeong Chae, Sung-Hoon Yoon, Tae-Kyun Kim, Kuk-Jin Yoon. 24 Nov 2021.