Knowledge Distillation Meets Self-Supervision
Guodong Xu, Ziwei Liu, Xiaoxiao Li, Chen Change Loy
12 June 2020 · arXiv:2006.07114
Papers citing "Knowledge Distillation Meets Self-Supervision"
50 / 145 papers shown
MoKD: Multi-Task Optimization for Knowledge Distillation
Zeeshan Hayder
A. Cheraghian
Lars Petersson
Mehrtash Harandi
VLM
54
0
0
13 May 2025
Cross-View Consistency Regularisation for Knowledge Distillation
W. Zhang
Dongnan Liu
Weidong Cai
Chao Ma
73
1
0
21 Dec 2024
Preview-based Category Contrastive Learning for Knowledge Distillation
Muhe Ding
Jianlong Wu
Xue Dong
Xiaojie Li
Pengda Qin
Tian Gan
Liqiang Nie
VLM
39
0
0
18 Oct 2024
Local-to-Global Self-Supervised Representation Learning for Diabetic Retinopathy Grading
Mostafa Hajighasemlou
Samad Sheikhaei
Hamid Soltanian-Zadeh
18
0
0
01 Oct 2024
Simple Unsupervised Knowledge Distillation With Space Similarity
Aditya Singh
Haohan Wang
31
1
0
20 Sep 2024
Applications of Knowledge Distillation in Remote Sensing: A Survey
Yassine Himeur
N. Aburaed
O. Elharrouss
Iraklis Varlamis
Shadi Atalla
W. Mansoor
Hussain Al Ahmad
45
4
0
18 Sep 2024
Low-Resolution Object Recognition with Cross-Resolution Relational Contrastive Distillation
Kangkai Zhang
Shiming Ge
Ruixin Shi
Dan Zeng
54
13
0
04 Sep 2024
PRG: Prompt-Based Distillation Without Annotation via Proxy Relational Graph
Yijin Xu
Jialun Liu
Hualiang Wei
Wenhui Li
38
0
0
22 Aug 2024
OVOSE: Open-Vocabulary Semantic Segmentation in Event-Based Cameras
Muhammad Rameez Ur Rahman
Jhony H. Giraldo
Indro Spinelli
Stéphane Lathuilière
Fabio Galasso
VLM
28
0
0
18 Aug 2024
Improving Zero-shot Generalization of Learned Prompts via Unsupervised Knowledge Distillation
Marco Mistretta
Alberto Baldrati
Marco Bertini
Andrew D. Bagdanov
VPVLM
VLM
35
6
0
03 Jul 2024
Mixing Natural and Synthetic Images for Robust Self-Supervised Representations
Reza Akbarian Bafghi
Nidhin Harilal
C. Monteleoni
M. Raissi
DiffM
38
0
0
18 Jun 2024
Adaptive Teaching with Shared Classifier for Knowledge Distillation
Jaeyeon Jang
Young-Ik Kim
Jisu Lim
Hyeonseong Lee
21
0
0
12 Jun 2024
OmniBind: Teach to Build Unequal-Scale Modality Interaction for Omni-Bind of All
Yuanhuiyi Lyu
Xueye Zheng
Dahun Kim
Lin Wang
51
13
0
25 May 2024
Retro: Reusing teacher projection head for efficient embedding distillation on Lightweight Models via Self-supervised Learning
Khanh-Binh Nguyen
Chae Jung Park
34
0
0
24 May 2024
Cross-sensor self-supervised training and alignment for remote sensing
V. Marsocci
Nicolas Audebert
33
1
0
16 May 2024
Feature Expansion and enhanced Compression for Class Incremental Learning
Quentin Ferdinand
G. Chenadec
Benoit Clement
Panagiotis Papadakis
Quentin Oliveau
CLL
22
0
0
13 May 2024
Attend, Distill, Detect: Attention-aware Entropy Distillation for Anomaly Detection
Sushovan Jena
Vishwas Saini
Ujjwal Shaw
Pavitra Jain
Abhay Singh Raihal
Anoushka Banerjee
Sharad Joshi
Ananth Ganesh
Arnav Bhavsar
32
0
0
10 May 2024
CKD: Contrastive Knowledge Distillation from A Sample-wise Perspective
Wencheng Zhu
Xin Zhou
Pengfei Zhu
Yu Wang
Qinghua Hu
VLM
64
1
0
22 Apr 2024
An Experimental Study on Exploring Strong Lightweight Vision Transformers via Masked Image Modeling Pre-Training
Jin Gao
Shubo Lin
Shaoru Wang
Yutong Kou
Zeming Li
Liang Li
Congxuan Zhang
Xiaoqin Zhang
Yizheng Wang
Weiming Hu
47
1
0
18 Apr 2024
Improve Knowledge Distillation via Label Revision and Data Selection
Weichao Lan
Yiu-ming Cheung
Qing Xu
Buhua Liu
Zhikai Hu
Mengke Li
Zhenghua Chen
37
2
0
03 Apr 2024
GeoAuxNet: Towards Universal 3D Representation Learning for Multi-sensor Point Clouds
Shengjun Zhang
Xin Fei
Yueqi Duan
3DPC
38
1
0
28 Mar 2024
CTSM: Combining Trait and State Emotions for Empathetic Response Model
Yufeng Wang
Chao Chen
Zhou Yang
Shuhui Wang
Xiangwen Liao
43
6
0
22 Mar 2024
Scale Decoupled Distillation
Shicai Wei
47
4
0
20 Mar 2024
V_kD: Improving Knowledge Distillation using Orthogonal Projections
Roy Miles
Ismail Elezi
Jiankang Deng
52
10
0
10 Mar 2024
Bit-mask Robust Contrastive Knowledge Distillation for Unsupervised Semantic Hashing
Liyang He
Zhenya Huang
Jiayu Liu
Enhong Chen
Fei-Yue Wang
Jing Sha
Shijin Wang
18
5
0
10 Mar 2024
Precise Knowledge Transfer via Flow Matching
Shitong Shao
Zhiqiang Shen
Linrui Gong
Huanran Chen
Xu Dai
32
2
0
03 Feb 2024
SlimSAM: 0.1% Data Makes Segment Anything Slim
Zigeng Chen
Gongfan Fang
Xinyin Ma
Xinchao Wang
33
13
0
08 Dec 2023
Augmentation-Free Dense Contrastive Knowledge Distillation for Efficient Semantic Segmentation
Jiawei Fan
Chao Li
Xiaolong Liu
Meina Song
Anbang Yao
25
5
0
07 Dec 2023
Comparative Knowledge Distillation
Alex Wilf
Alex Tianyi Xu
Paul Pu Liang
A. Obolenskiy
Daniel Fried
Louis-Philippe Morency
VLM
18
1
0
03 Nov 2023
torchdistill Meets Hugging Face Libraries for Reproducible, Coding-Free Deep Learning Studies: A Case Study on NLP
Yoshitomo Matsubara
VLM
26
1
0
26 Oct 2023
Understanding the Effects of Projectors in Knowledge Distillation
Yudong Chen
Sen Wang
Jiajun Liu
Xuwei Xu
Frank de Hoog
Brano Kusy
Zi Huang
26
0
0
26 Oct 2023
Towards Anytime Fine-tuning: Continually Pre-trained Language Models with Hypernetwork Prompt
Gangwei Jiang
Caigao Jiang
Siqiao Xue
James Y. Zhang
Junqing Zhou
Defu Lian
Ying Wei
VLM
32
7
0
19 Oct 2023
Unsupervised Pretraining for Fact Verification by Language Model Distillation
A. Bazaga
Pietro Lió
Bo Dai
HILM
33
2
0
28 Sep 2023
A Sentence Speaks a Thousand Images: Domain Generalization through Distilling CLIP with Language Guidance
Zeyi Huang
Andy Zhou
Zijian Lin
Mu Cai
Haohan Wang
Yong Jae Lee
VLM
OOD
32
28
0
21 Sep 2023
Heterogeneous Generative Knowledge Distillation with Masked Image Modeling
Ziming Wang
Shumin Han
Xiaodi Wang
Jing Hao
Xianbin Cao
Baochang Zhang
VLM
32
0
0
18 Sep 2023
Self-Training and Multi-Task Learning for Limited Data: Evaluation Study on Object Detection
Hoàng-Ân Lê
Minh-Tan Pham
37
2
0
12 Sep 2023
MoMA: Momentum Contrastive Learning with Multi-head Attention-based Knowledge Distillation for Histopathology Image Analysis
T. Vuong
J. T. Kwak
41
6
0
31 Aug 2023
Teacher-Student Architecture for Knowledge Distillation: A Survey
Chengming Hu
Xuan Li
Danyang Liu
Haolun Wu
Xi Chen
Ju Wang
Xue Liu
21
16
0
08 Aug 2023
A Dimensional Structure based Knowledge Distillation Method for Cross-Modal Learning
Hui Xiong
Hongwei Dong
Jingyao Wang
J. Yu
Wen-jie Zhai
Changwen Zheng
Fanjiang Xu
Gang Hua
24
1
0
28 Jun 2023
Categories of Response-Based, Feature-Based, and Relation-Based Knowledge Distillation
Chuanguang Yang
Xinqiang Yu
Zhulin An
Yongjun Xu
VLM
OffRL
86
22
0
19 Jun 2023
Triplet Knowledge Distillation
Xijun Wang
Dongyang Liu
Meina Kan
Chunrui Han
Zhongqin Wu
Shiguang Shan
29
3
0
25 May 2023
On the Impact of Knowledge Distillation for Model Interpretability
Hyeongrok Han
Siwon Kim
Hyun-Soo Choi
Sungroh Yoon
24
4
0
25 May 2023
Revisiting Token Dropping Strategy in Efficient BERT Pretraining
Qihuang Zhong
Liang Ding
Juhua Liu
Xuebo Liu
Min Zhang
Bo Du
Dacheng Tao
VLM
34
9
0
24 May 2023
NORM: Knowledge Distillation via N-to-One Representation Matching
Xiaolong Liu
Lujun Li
Chao Li
Anbang Yao
55
68
0
23 May 2023
DeepAqua: Self-Supervised Semantic Segmentation of Wetland Surface Water Extent with SAR Images using Knowledge Distillation
Francisco J. Peña
Clara Hubinger
A. H. Payberah
F. Jaramillo
25
0
0
02 May 2023
Function-Consistent Feature Distillation
Dongyang Liu
Meina Kan
Shiguang Shan
Xilin Chen
44
18
0
24 Apr 2023
360° High-Resolution Depth Estimation via Uncertainty-aware Structural Knowledge Transfer
Zidong Cao
Hao Ai
Athanasios V. Vasilakos
Lin Wang
21
1
0
17 Apr 2023
Multi-Mode Online Knowledge Distillation for Self-Supervised Visual Representation Learning
Kaiyou Song
Jin Xie
Shanyi Zhang
Zimeng Luo
30
29
0
13 Apr 2023
A Survey on Recent Teacher-student Learning Studies
Min Gao
23
3
0
10 Apr 2023
Doubly Stochastic Models: Learning with Unbiased Label Noises and Inference Stability
Haoyi Xiong
Xuhong Li
Bo Yu
Zhanxing Zhu
Dongrui Wu
Dejing Dou
NoLa
9
0
0
01 Apr 2023