Deep Mutual Learning
Ying Zhang, Tao Xiang, Timothy M. Hospedales, Huchuan Lu
arXiv:1706.00384 · 1 June 2017 · Community: FedML
Links: ArXiv | PDF | HTML
Papers citing "Deep Mutual Learning" (showing 50 of 710)
Each entry lists the title (with topic tags in brackets where assigned), the authors, and the publication date.

Self Distillation via Iterative Constructive Perturbations [ODL]
Maheak Dave, Aniket K. Singh, Aryan Pareek, Harshita Jha, Debasis Chaudhuri, Manish P. Singh (20 May 2025)

Intra-class Patch Swap for Self-Distillation
Hongjun Choi, Eun Som Jeon, Ankita Shukla, Pavan Turaga (20 May 2025)

FiGKD: Fine-Grained Knowledge Distillation via High-Frequency Detail Transfer
Seonghak Kim (17 May 2025)

MoKD: Multi-Task Optimization for Knowledge Distillation [VLM]
Zeeshan Hayder, A. Cheraghian, Lars Petersson, Mehrtash Harandi (13 May 2025)

How to Backdoor the Knowledge Distillation [AAML]
C. Wu, Qian Ma, P. Mitra, Sencun Zhu (30 Apr 2025)

Swapped Logit Distillation via Bi-level Teacher Alignment
Stephen Ekaputra Limantoro, Jhe-Hao Lin, Chih-Yu Wang, Yi-Lung Tsai, Hong-Han Shuai, Ching-Chun Huang, Wen-Huang Cheng (27 Apr 2025)

Collaborative Learning of On-Device Small Model and Cloud-Based Large Model: Advances and Future Directions
Chaoyue Niu, Yucheng Ding, Junhui Lu, Zhengxiang Huang, Hang Zeng, Yutong Dai, Xuezhen Tu, Chengfei Lv, Fan Wu, Guihai Chen (17 Apr 2025)

Generative Classifier for Domain Generalization
Shaocong Long, Qianyu Zhou, Xuelong Li, Chenhao Ying, Yunhai Tong, Lizhuang Ma, Yuan Luo, Dacheng Tao (03 Apr 2025)

v-CLR: View-Consistent Learning for Open-World Instance Segmentation [3DV, VLM]
Chang-Bin Zhang, Jinhong Ni, Yujie Zhong, Kai Han (02 Apr 2025)

An Efficient Training Algorithm for Models with Block-wise Sparsity
Ding Zhu, Zhiqun Zuo, Mohammad Mahdi Khalili (27 Mar 2025)

Bezier Distillation
Ling Feng, SK Yang (20 Mar 2025)

Asymmetric Decision-Making in Online Knowledge Distillation: Unifying Consensus and Divergence
Zhaowei Chen, Borui Zhao, Yuchen Ge, Yuhao Chen, Renjie Song, Jiajun Liang (09 Mar 2025)

Causality Enhanced Origin-Destination Flow Prediction in Data-Scarce Cities
Tao Feng, Yunke Zhang, Huandong Wang, Yong Li (09 Mar 2025)

Federated Learning Framework via Distributed Mutual Learning [FedML]
Yash Gupta (03 Mar 2025)

FedMHO: Heterogeneous One-Shot Federated Learning Towards Resource-Constrained Edge Devices
Dezhong Yao, Yuexin Shi, Tongtong Liu, Zhiqiang Xu (12 Feb 2025)

Rethinking Knowledge in Distillation: An In-context Sample Retrieval Perspective
Jinjing Zhu, Songze Li, Lin Wang (13 Jan 2025)

Towards Mitigating Architecture Overfitting on Distilled Datasets [DD]
Xuyang Zhong, Chen Liu (08 Jan 2025)

Knowledge Distillation with Adapted Weight
Sirong Wu, Xi Luo, Junjie Liu, Yuhui Deng (06 Jan 2025)

Cross-View Consistency Regularisation for Knowledge Distillation
W. Zhang, Dongnan Liu, Weidong Cai, Chao Ma (21 Dec 2024)

LiRCDepth: Lightweight Radar-Camera Depth Estimation via Knowledge Distillation and Uncertainty Guidance
Huawei Sun, Nastassia Vysotskaya, Tobias Sukianto, Hao Feng, Julius Ott, Xiangyuan Peng, Lorenzo Servadei, Robert Wille (20 Dec 2024)

OccScene: Semantic Occupancy-based Cross-task Mutual Learning for 3D Scene Generation [DiffM]
Bohan Li, Xin Jin, Rongxiang Weng, Yukai Shi, Yasheng Sun, ..., Zhuang Ma, Baao Xie, Chao Ma, Xiaokang Yang, Wenjun Zeng (15 Dec 2024)

Multi-Branch Mutual-Distillation Transformer for EEG-Based Seizure Subtype Classification [MedIm]
Ruimin Peng, Zhenbang Du, Changming Zhao, Jingwei Luo, Wenzhong Liu, Xinxing Chen, Dongrui Wu (04 Dec 2024)

Lightweight Contenders: Navigating Semi-Supervised Text Mining through Peer Collaboration and Self Transcendence
Qianren Mao, Weifeng Jiang, Qingbin Liu, Chenghua Lin, Qian Li, Xianqing Wen, Jianxin Li, Jinhu Lu (01 Dec 2024)

Knowledge-Data Fusion Based Source-Free Semi-Supervised Domain Adaptation for Seizure Subtype Classification
Ruimin Peng, Jiayu An, Dongrui Wu (29 Nov 2024)

When Babies Teach Babies: Can student knowledge sharing outperform Teacher-Guided Distillation on small datasets? [FedML]
Srikrishna Iyer (25 Nov 2024)

Prior-based Objective Inference Mining Potential Uncertainty for Facial Expression Recognition [3DH]
Hanwei Liu, Huiling Cai, Qingcheng Lin, Xuefeng Li, Hui Xiao (20 Nov 2024)

KDC-MAE: Knowledge Distilled Contrastive Mask Auto-Encoder
Maheswar Bora, Saurabh Atreya, Aritra Mukherjee, Abhijit Das (19 Nov 2024)

Dual-Head Knowledge Distillation: Enhancing Logits Utilization with an Auxiliary Head
Penghui Yang, Chen-Chen Zong, Sheng-Jun Huang, Lei Feng, Bo An (13 Nov 2024)

GazeGen: Gaze-Driven User Interaction for Visual Content Generation [VGen]
He-Yen Hsieh, Ziyun Li, Sai Qian Zhang, W. Ting, Kao-Den Chang, B. D. Salvo, Chiao Liu, H. T. Kung (07 Nov 2024)

Self-supervised cross-modality learning for uncertainty-aware object detection and recognition in applications which lack pre-labelled training data [UQCV]
Irum Mehboob, Li Sun, Alireza Astegarpanah, Rustam Stolkin (05 Nov 2024)

Decoupling Dark Knowledge via Block-wise Logit Distillation for Feature-level Alignment
Chengting Yu, Fengzhao Zhang, Ruizhe Chen, Zuozhu Liu, Shurun Tan, Er-ping Li, Aili Wang (03 Nov 2024)

Enhancing Neural Network Interpretability with Feature-Aligned Sparse Autoencoders [AAML]
Luke Marks, Alasdair Paren, David M. Krueger, Fazl Barez (02 Nov 2024)

Multiple Information Prompt Learning for Cloth-Changing Person Re-Identification
Shengxun Wei, Zan Gao, Yibo Zhao, Weili Guan, Shengyong Chen (01 Nov 2024)

GSSF: Generalized Structural Sparse Function for Deep Cross-modal Metric Learning
Haiwen Diao, Ying Zhang, Shang Gao, Jiawen Zhu, Long Chen, Huchuan Lu (20 Oct 2024)

Browsing without Third-Party Cookies: What Do You See? [BDL]
Maxwell Lin, Shihan Lin, Helen Wu, Karen Wang, Xiaowei Yang (14 Oct 2024)

Adaptive Guidance for Local Training in Heterogeneous Federated Learning [FedML]
Jianqing Zhang, Yang Liu, Yang Hua, Jian Cao, Qiang Yang (09 Oct 2024)

Collaborative Knowledge Distillation via a Learning-by-Education Node Community
Anestis Kaimakamidis, Ioannis Mademlis, Ioannis Pitas (30 Sep 2024)

Classroom-Inspired Multi-Mentor Distillation with Adaptive Learning Strategies
Shalini Sarode, Muhammad Saif Ullah Khan, Tahira Shehzadi, Didier Stricker, Muhammad Zeshan Afzal (30 Sep 2024)

Student-Oriented Teacher Knowledge Refinement for Knowledge Distillation
Yaxin Peng, Yaomin Huang, Haokun Zhu, Jinsong Fan, Guixu Zhang (27 Sep 2024)

Harmonizing knowledge Transfer in Neural Network with Unified Distillation
Yaomin Huang, Zaomin Yan, Yaxin Peng, Faming Fang, Guixu Zhang (27 Sep 2024)

Trustworthy AI: Securing Sensitive Data in Large Language Models
G. Feretzakis, V. Verykios (26 Sep 2024)

Online Multi-level Contrastive Representation Distillation for Cross-Subject fNIRS Emotion Recognition
Zhili Lai, Chunmei Qing, Junpeng Tan, Wanxiang Luo, Xiangmin Xu (24 Sep 2024)

Simple Unsupervised Knowledge Distillation With Space Similarity
Aditya Singh, Haohan Wang (20 Sep 2024)

Exploring and Enhancing the Transfer of Distribution in Knowledge Distillation for Autoregressive Language Models
Jun Rao, Xuebo Liu, Zepeng Lin, Liang Ding, Jing Li, Dacheng Tao, Min Zhang (19 Sep 2024)

CNN-Transformer Rectified Collaborative Learning for Medical Image Segmentation [ViT, MedIm]
Lanhu Wu, Miao Zhang, Yongri Piao, Zhenyan Yao, Weibing Sun, Feng Tian, Huchuan Lu (25 Aug 2024)

PRG: Prompt-Based Distillation Without Annotation via Proxy Relational Graph
Yijin Xu, Jialun Liu, Hualiang Wei, Wenhui Li (22 Aug 2024)

Computer Vision Model Compression Techniques for Embedded Systems: A Survey
Alexandre Lopes, Fernando Pereira dos Santos, D. Oliveira, Mauricio Schiezaro, Hélio Pedrini (15 Aug 2024)

Deep Companion Learning: Enhancing Generalization Through Historical Consistency [FedML]
Ruizhao Zhu, Venkatesh Saligrama (26 Jul 2024)

DFMSD: Dual Feature Masking Stage-wise Knowledge Distillation for Object Detection
Zhourui Zhang, Jun Li, Zhijian Wu, Jifeng Shen, Jianhua Xu (18 Jul 2024)

Deep Mutual Learning among Partially Labeled Datasets for Multi-Organ Segmentation
Xiaoyu Liu, Linhao Qu, Ziyue Xie, Yonghong Shi, Zhijian Song (17 Jul 2024)