arXiv: 2205.10536 (v3, latest)
Knowledge Distillation from A Stronger Teacher
21 May 2022
Tao Huang
Shan You
Fei Wang
Chao Qian
Chang Xu
Links: arXiv (abs) · PDF · HTML · GitHub (146★)
Papers citing "Knowledge Distillation from A Stronger Teacher" (50 / 131 papers shown):
Relational Representation Distillation. Nikolaos Giakoumoglou, Tania Stathaki. 16 Jul 2024.
HDKD: Hybrid Data-Efficient Knowledge Distillation Network for Medical Image Classification. Omar S. El-Assiouti, Ghada Hamed, Dina Khattab, H. M. Ebied. 10 Jul 2024.
Topological Persistence Guided Knowledge Distillation for Wearable Sensor Data. Eun Som Jeon, Hongjun Choi, A. Shukla, Yuan Wang, Hyunglae Lee, M. Buman, Pavan Turaga. 07 Jul 2024.
Direct Preference Knowledge Distillation for Large Language Models. Yixing Li, Yuxian Gu, Li Dong, Dequan Wang, Yu Cheng, Furu Wei. 28 Jun 2024.
InFiConD: Interactive No-code Fine-tuning with Concept-based Knowledge Distillation. Jinbin Huang, Wenbin He, Liang Gou, Liu Ren, Chris Bryan. 25 Jun 2024.
D2LLM: Decomposed and Distilled Large Language Models for Semantic Search. Zihan Liao, Hang Yu, Jianguo Li, Jun Wang, Wei Zhang. 25 Jun 2024.
HLQ: Fast and Efficient Backpropagation via Hadamard Low-rank Quantization. Seonggon Kim, Eunhyeok Park. 21 Jun 2024.
BiLD: Bi-directional Logits Difference Loss for Large Language Model Distillation. Minchong Li, Feng Zhou, Xiaohui Song. 19 Jun 2024.
Enhancing Single-Slice Segmentation with 3D-to-2D Unpaired Scan Distillation. Xin Yu, Qi Yang, Han Liu, Ho Hin Lee, Yucheng Tang, ..., Shunxing Bao, Yuankai Huo, Ann Zenobia Moore, Luigi Ferrucci, Bennett A. Landman. 18 Jun 2024.
Estimating Human Poses Across Datasets: A Unified Skeleton and Multi-Teacher Distillation Approach [3DH]. Muhammad Gul Zain Ali Khan, Dhavalkumar Limbachiya, Didier Stricker, Muhammad Zeshan Afzal. 30 May 2024.
Relation Modeling and Distillation for Learning with Noisy Labels [NoLa]. Xiaming Chen, Junlin Zhang, Zhuang Qi, Xin Qi. 30 May 2024.
Aligning in a Compact Space: Contrastive Knowledge Distillation between Heterogeneous Architectures. Hongjun Wu, Li Xiao, Xingkuo Zhang, Yining Miao. 28 May 2024.
Label-efficient Semantic Scene Completion with Scribble Annotations. Song Wang, Jiawei Yu, Wentong Li, Hao Shi, Kailun Yang, Junbo Chen, Jianke Zhu. 24 May 2024.
Exploring Dark Knowledge under Various Teacher Capacities and Addressing Capacity Mismatch. Wen-Shu Fan, Xin-Chun Li, Bowen Tao. 21 May 2024.
Fully Exploiting Every Real Sample: SuperPixel Sample Gradient Model Stealing [AAML]. Yunlong Zhao, Xiaoheng Deng, Yijing Liu, Xin-jun Pei, Jiazhi Xia, Wei Chen. 18 May 2024.
From Algorithm to Hardware: A Survey on Efficient and Safe Deployment of Deep Neural Networks. Xue Geng, Zhe Wang, Chunyun Chen, Qing Xu, Kaixin Xu, ..., Zhenghua Chen, M. Aly, Jie Lin, Min-man Wu, Xiaoli Li. 09 May 2024.
IPixMatch: Boost Semi-supervised Semantic Segmentation with Inter-Pixel Relation. Kebin Wu, Wenbin Li, Xiaofei Xiao. 29 Apr 2024.
Revealing the Two Sides of Data Augmentation: An Asymmetric Distillation-based Win-Win Solution for Open-Set Recognition. Yunbing Jia, Xiaoyu Kong, Fan Tang, Yixing Gao, Weiming Dong, Yi Yang. 28 Apr 2024.
Promoting CNNs with Cross-Architecture Knowledge Distillation for Efficient Monocular Depth Estimation. Zhimeng Zheng, Tao Huang, Gongsheng Li, Zuyi Wang. 25 Apr 2024.
CKD: Contrastive Knowledge Distillation from A Sample-wise Perspective [VLM]. Wencheng Zhu, Xin Zhou, Pengfei Zhu, Yu Wang, Qinghua Hu. 22 Apr 2024.
Robust feature knowledge distillation for enhanced performance of lightweight crack segmentation models. Zhaohui Chen, Elyas Asadi Shamsabadi, Sheng Jiang, Luming Shen, Daniel Dias-da-Costa. 09 Apr 2024.
Diverse and Tailored Image Generation for Zero-shot Multi-label Classification [VLM]. Kai Zhang, Zhixiang Yuan, Tao Huang. 04 Apr 2024.
Learning to Project for Cross-Task Knowledge Distillation. Dylan Auty, Roy Miles, Benedikt Kolbeinsson, K. Mikolajczyk. 21 Mar 2024.
Histo-Genomic Knowledge Distillation For Cancer Prognosis From Histopathology Whole Slide Images. Zhikang Wang, Yumeng Zhang, Yingxue Xu, S. Imoto, Hao Chen, Jiangning Song. 15 Mar 2024.
DiTMoS: Delving into Diverse Tiny-Model Selection on Microcontrollers. Xiao Ma, Shengfeng He, Hezhe Qiao, Dong-Lai Ma. 14 Mar 2024.
LIX: Implicitly Infusing Spatial Geometric Prior Knowledge into Visual Semantic Segmentation for Autonomous Driving. Sicen Guo, Zhiyuan Wu, Qijun Chen, Ioannis Pitas, Rui Fan. 13 Mar 2024.
Attention-guided Feature Distillation for Semantic Segmentation. Amir M. Mansourian, Arya Jalali, Rozhan Ahmadi, S. Kasaei. 08 Mar 2024.
Logit Standardization in Knowledge Distillation. Shangquan Sun, Wenqi Ren, Jingzhi Li, Rui Wang, Xiaochun Cao. 03 Mar 2024.
Weakly Supervised Monocular 3D Detection with a Single-View Image. Xue-Qiu Jiang, Sheng Jin, Lewei Lu, Xiaoqin Zhang, Shijian Lu. 29 Feb 2024.
Knowledge Distillation Based on Transformed Teacher Matching. Kaixiang Zheng, En-Hui Yang. 17 Feb 2024.
Data-efficient Large Vision Models through Sequential Autoregression [VLM]. Jianyuan Guo, Zhiwei Hao, Chengcheng Wang, Yehui Tang, Han Wu, Han Hu, Kai Han, Chang Xu. 07 Feb 2024.
Precise Knowledge Transfer via Flow Matching. Shitong Shao, Zhiqiang Shen, Linrui Gong, Huanran Chen, Xu Dai. 03 Feb 2024.
Self-supervised Video Object Segmentation with Distillation Learning of Deformable Attention [VOS]. Quang-Trung Truong, Duc Thanh Nguyen, Binh-Son Hua, Sai-Kit Yeung. 25 Jan 2024.
Rethinking Centered Kernel Alignment in Knowledge Distillation. Zikai Zhou, Yunhang Shen, Shitong Shao, Linrui Gong, Shaohui Lin. 22 Jan 2024.
Generative Denoise Distillation: Simple Stochastic Noises Induce Efficient Knowledge Transfer for Dense Prediction [VLM]. Zhaoge Liu, Xiaohao Xu, Yunkang Cao, Nong Sang. 16 Jan 2024.
TelME: Teacher-leading Multimodal Fusion Network for Emotion Recognition in Conversation. Taeyang Yun, Hyunkuk Lim, Jeong-Hoon Lee, Min Song. 16 Jan 2024.
Transferring Core Knowledge via Learngenes. Fu Feng, Jing Wang, Xin Geng. 16 Jan 2024.
Direct Distillation between Different Domains. Jialiang Tang, Shuo Chen, Gang Niu, Hongyuan Zhu, Qiufeng Wang, Chen Gong, Masashi Sugiyama. 12 Jan 2024.
Towards Efficient and Effective Text-to-Video Retrieval with Coarse-to-Fine Visual Representation Learning. Kaibin Tian, Yanhua Cheng, Yi Liu, Xinglin Hou, Quan Chen, Han Li. 01 Jan 2024.
Less or More From Teacher: Exploiting Trilateral Geometry For Knowledge Distillation. Chengming Hu, Haolun Wu, Xuan Li, Chen Ma, Xi Chen, Jun Yan, Boyu Wang, Xue Liu. 22 Dec 2023.
RdimKD: Generic Distillation Paradigm by Dimensionality Reduction. Yi Guo, Yiqian He, Xiaoyang Li, Haotong Qin, Van Tung Pham, Yang Zhang, Shouda Liu. 14 Dec 2023.
Generalized Large-Scale Data Condensation via Various Backbone and Statistical Matching [DD]. Shitong Shao, Zeyuan Yin, Muxin Zhou, Xindong Zhang, Zhiqiang Shen. 29 Nov 2023.
Robustness-Reinforced Knowledge Distillation with Correlation Distance and Network Pruning. Seonghak Kim, Gyeongdo Ham, Yucheol Cho, Daeshik Kim. 23 Nov 2023.
DONUT-hole: DONUT Sparsification by Harnessing Knowledge and Optimizing Learning Efficiency. Azhar Shaikh, Michael Cochez, Denis Diachkov, Michiel de Rijcke, Sahar Yousefi. 09 Nov 2023.
Preference-Consistent Knowledge Distillation for Recommender System. Zhangchi Zhu, Wei Zhang. 08 Nov 2023.
Comparative Knowledge Distillation [VLM]. Alex Wilf, Alex Tianyi Xu, Paul Pu Liang, A. Obolenskiy, Daniel Fried, Louis-Philippe Morency. 03 Nov 2023.
Distilling Out-of-Distribution Robustness from Vision-Language Foundation Models [VLM]. Andy Zhou, Jindong Wang, Yu-Xiong Wang, Haohan Wang. 02 Nov 2023.
One-for-All: Bridge the Gap Between Heterogeneous Architectures in Knowledge Distillation. Zhiwei Hao, Jianyuan Guo, Kai Han, Yehui Tang, Han Hu, Yunhe Wang, Chang Xu. 30 Oct 2023.
Understanding the Effects of Projectors in Knowledge Distillation. Yudong Chen, Sen Wang, Jiajun Liu, Xuwei Xu, Frank de Hoog, Brano Kusy, Zi Huang. 26 Oct 2023.
AST: Effective Dataset Distillation through Alignment with Smooth and High-Quality Expert Trajectories [DD]. Jiyuan Shen, Wenzhuo Yang, Kwok-Yan Lam. 16 Oct 2023.