Knowledge Distillation Meets Self-Supervision
Guodong Xu, Ziwei Liu, Xiaoxiao Li, Chen Change Loy
12 June 2020 · arXiv: 2006.07114
Topics: FedML

Papers citing "Knowledge Distillation Meets Self-Supervision"

45 / 145 papers shown
Each entry lists title, authors, topic tags (if any), publication date, and the site's three activity counters.

Semi-Online Knowledge Distillation
Zhiqiang Liu, Yanxia Liu, Chengkai Huang
23 Nov 2021 · 19 / 5 / 0

GenURL: A General Framework for Unsupervised Representation Learning
Siyuan Li, Zicheng Liu, Z. Zang, Di Wu, Zhiyuan Chen, Stan Z. Li
Topics: OOD, 3DGS, OffRL
27 Oct 2021 · 34 / 9 / 0

Mosaicking to Distill: Knowledge Distillation from Out-of-Domain Data
Gongfan Fang, Yifan Bao, Jie Song, Xinchao Wang, Don Xie, Chengchao Shen, Xiuming Zhang
27 Oct 2021 · 43 / 44 / 0

Learning Proposals for Practical Energy-Based Regression
L. Kumar, Martin Danelljan, Thomas B. Schon
22 Oct 2021 · 30 / 4 / 0

Learning Rich Nearest Neighbor Representations from Self-supervised Ensembles
Bram Wallace, Devansh Arpit, Huan Wang, Caiming Xiong
Topics: SSL, OOD
19 Oct 2021 · 30 / 0 / 0

Mask or Non-Mask? Robust Face Mask Detector via Triplet-Consistency Representation Learning
Chunxi Yang, Thanh Hai Phung, Hong-Han Shuai, Wen-Huang Cheng
Topics: CVBM
01 Oct 2021 · 27 / 14 / 0

Self Supervision to Distillation for Long-Tailed Visual Recognition
Tianhao Li, Limin Wang, Gangshan Wu
09 Sep 2021 · 45 / 101 / 0

Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution
Chuanguang Yang, Zhulin An, Linhang Cai, Yongjun Xu
07 Sep 2021 · 22 / 15 / 0

Dual Transfer Learning for Event-based End-task Prediction via Pluggable Event to Image Translation
Lin Wang, Yujeong Chae, Kuk-Jin Yoon
04 Sep 2021 · 27 / 32 / 0

Self-Regulation for Semantic Segmentation
Zhangfu Dong, Zhang Hanwang, T. Jinhui, Huang Xiansheng, Sun Qianru
22 Aug 2021 · 36 / 35 / 0

Distilling Holistic Knowledge with Graph Neural Networks
Sheng Zhou, Yucheng Wang, Defang Chen, Jiawei Chen, Xin Wang, Can Wang, Jiajun Bu
12 Aug 2021 · 15 / 54 / 0

Neighborhood Consensus Contrastive Learning for Backward-Compatible Representation
Shengsen Wu, Liang Chen, Yihang Lou, Yan Bai, Tao Bai, Minghua Deng, Ling-yu Duan
07 Aug 2021 · 24 / 8 / 0

Linking Common Vulnerabilities and Exposures to the MITRE ATT&CK Framework: A Self-Distillation Approach
Benjamin Ampel, Sagar Samtani, Steven Ullman, Hsinchun Chen
03 Aug 2021 · 25 / 35 / 0

Hierarchical Self-supervised Augmented Knowledge Distillation
Chuanguang Yang, Zhulin An, Linhang Cai, Yongjun Xu
Topics: SSL
29 Jul 2021 · 29 / 76 / 0

Continual Contrastive Learning for Image Classification
Zhiwei Lin, Yongtao Wang, Hongxiang Lin
Topics: SSL, CLL
05 Jul 2021 · 24 / 13 / 0

Bag of Instances Aggregation Boosts Self-supervised Distillation
Haohang Xu, Jiemin Fang, Xiaopeng Zhang, Lingxi Xie, Xinggang Wang, Wenrui Dai, H. Xiong, Qi Tian
Topics: SSL
04 Jul 2021 · 28 / 21 / 0

Few-Shot Learning with a Strong Teacher
Han-Jia Ye, Lu Ming, De-Chuan Zhan, Wei-Lun Chao
01 Jul 2021 · 19 / 49 / 0

Teacher's pet: understanding and mitigating biases in distillation
Michal Lukasik, Srinadh Bhojanapalli, A. Menon, Sanjiv Kumar
19 Jun 2021 · 18 / 25 / 0

Zero-Shot Knowledge Distillation from a Decision-Based Black-Box Model
Z. Wang
07 Jun 2021 · 11 / 43 / 0

Towards Compact Single Image Super-Resolution via Contrastive Self-distillation
Yanbo Wang, Shaohui Lin, Yanyun Qu, Haiyan Wu, Zhizhong Zhang, Yuan Xie, Angela Yao
Topics: SupR
25 May 2021 · 22 / 53 / 0

DisCo: Remedy Self-supervised Learning on Lightweight Models with Distilled Contrastive Learning
Yuting Gao, Jia-Xin Zhuang, Xiaowei Guo, Hao Cheng, Xing Sun, Ke Li, Feiyue Huang
19 Apr 2021 · 39 / 40 / 0

Application of Computer Vision and Machine Learning for Digitized Herbarium Specimens: A Systematic Literature Review
Burhan Rashid Hussein, O. A. Malik, Wee-Hong Ong, Johan Willem Frederik Slik
18 Apr 2021 · 10 / 4 / 0

Learn Goal-Conditioned Policy with Intrinsic Motivation for Deep Reinforcement Learning
Jinxin Liu, Donglin Wang, Qiangxing Tian, Zhengyu Chen
11 Apr 2021 · 27 / 23 / 0

Distilling and Transferring Knowledge via cGAN-generated Samples for Image Classification and Regression
Xin Ding, Z. J. Wang, Zuheng Xu, Z. Jane Wang, William J. Welch
07 Apr 2021 · 30 / 22 / 0

Complementary Relation Contrastive Distillation
Jinguo Zhu, Shixiang Tang, Dapeng Chen, Shijie Yu, Yakun Liu, A. Yang, M. Rong, Xiaohua Wang
29 Mar 2021 · 24 / 77 / 0

Distilling a Powerful Student Model via Online Knowledge Distillation
Shaojie Li, Mingbao Lin, Yan Wang, Yongjian Wu, Yonghong Tian, Ling Shao, Rongrong Ji
Topics: FedML
26 Mar 2021 · 27 / 46 / 0

Learning Student-Friendly Teacher Networks for Knowledge Distillation
D. Park, Moonsu Cha, C. Jeong, Daesin Kim, Bohyung Han
12 Feb 2021 · 121 / 100 / 0

IC Networks: Remodeling the Basic Unit for Convolutional Neural Networks
Junyi An, Fengshan Liu, Jian Zhao, S. Furao
06 Feb 2021 · 8 / 1 / 0

Knowledge Distillation in Iterative Generative Models for Improved Sampling Speed
Eric Luhman, Troy Luhman
Topics: DiffM
07 Jan 2021 · 195 / 258 / 0

Computation-Efficient Knowledge Distillation via Uncertainty-Aware Mixup
Guodong Xu, Ziwei Liu, Chen Change Loy
Topics: UQCV
17 Dec 2020 · 21 / 39 / 0

Cross-Layer Distillation with Semantic Calibration
Defang Chen, Jian-Ping Mei, Yuan Zhang, Can Wang, Yan Feng, Chun-Yen Chen
Topics: FedML
06 Dec 2020 · 45 / 286 / 0

Multi-level Knowledge Distillation via Knowledge Alignment and Correlation
Fei Ding, Yin Yang, Hongxin Hu, V. Krovi, Feng Luo
01 Dec 2020 · 16 / 4 / 0

torchdistill: A Modular, Configuration-Driven Framework for Knowledge Distillation
Yoshitomo Matsubara
25 Nov 2020 · 11 / 25 / 0

SLADE: A Self-Training Framework For Distance Metric Learning
Jiali Duan, Yen-Liang Lin, Son N. Tran, Larry S. Davis, C.-C. Jay Kuo
20 Nov 2020 · 19 / 11 / 0

Distilling Knowledge by Mimicking Features
G. Wang, Yifan Ge, Jianxin Wu
03 Nov 2020 · 17 / 33 / 0

Ferrograph image classification
Peng Peng, Jiugen Wang
14 Oct 2020 · 9 / 1 / 0

Locally Linear Region Knowledge Distillation
Xiang Deng, Zhongfei Zhang
09 Oct 2020 · 17 / 0 / 0

Densely Guided Knowledge Distillation using Multiple Teacher Assistants
Wonchul Son, Jaemin Na, Junyong Choi, Wonjun Hwang
18 Sep 2020 · 20 / 110 / 0

SSKD: Self-Supervised Knowledge Distillation for Cross Domain Adaptive Person Re-Identification
Junhui Yin, Jiayan Qiu, Siqing Zhang, Zhanyu Ma, Jun Guo
13 Sep 2020 · 16 / 5 / 0

A Unified Framework for Shot Type Classification Based on Subject Centric Lens
Anyi Rao, Jiaze Wang, Linning Xu, Xuekun Jiang, Qingqiu Huang, Bolei Zhou, Dahua Lin
08 Aug 2020 · 18 / 60 / 0

Self-supervised Knowledge Distillation for Few-shot Learning
Jathushan Rajasegaran, Salman Khan, Munawar Hayat, F. Khan, M. Shah
Topics: SSL
17 Jun 2020 · 31 / 91 / 0

Knowledge Distillation: A Survey
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
Topics: VLM
09 Jun 2020 · 19 / 2,843 / 0

ResKD: Residual-Guided Knowledge Distillation
Xuewei Li, Songyuan Li, Bourahla Omar, Fei Wu, Xi Li
08 Jun 2020 · 21 / 47 / 0

Knowledge Distillation by On-the-Fly Native Ensemble
Xu Lan, Xiatian Zhu, S. Gong
12 Jun 2018 · 197 / 473 / 0

Boosting Self-Supervised Learning via Knowledge Transfer
M. Noroozi, Ananth Vinjimoor, Paolo Favaro, Hamed Pirsiavash
Topics: SSL
01 May 2018 · 209 / 292 / 0