Deep Mutual Learning [FedML]
Ying Zhang, Tao Xiang, Timothy M. Hospedales, Huchuan Lu · 1 June 2017
arXiv:1706.00384
Papers citing "Deep Mutual Learning" (50 of 710 papers shown)
Practical Edge Detection via Robust Collaborative Learning
Yuanbin Fu, Xiaojie Guo · 27 Aug 2023
Computation-efficient Deep Learning for Computer Vision: A Survey [VLM]
Yulin Wang, Yizeng Han, Chaofei Wang, Shiji Song, Qi Tian, Gao Huang · 27 Aug 2023
Unsupervised Domain Adaptation via Domain-Adaptive Diffusion [DiffM]
Duo Peng, Qiuhong Ke, Yinjie Lei, Jing Liu · 26 Aug 2023
SamDSK: Combining Segment Anything Model with Domain-Specific Knowledge for Semi-Supervised Learning in Medical Image Segmentation
Yizhe Zhang, Tao Zhou, Shuo Wang, Ye Wu, Pengfei Gu, Da Chen · 26 Aug 2023
Dual Compensation Residual Networks for Class Imbalanced Learning
Rui Hou, Hong Chang, Bingpeng Ma, Shiguang Shan, Xilin Chen · 25 Aug 2023
Asymmetric Co-Training with Explainable Cell Graph Ensembling for Histopathological Image Classification
Ziqi Yang, Zhongyu Li, Chen Liu, Xiangde Luo, Xingguang Wang, Dou Xu, Chao-Ting Li, Xiaoying Qin, Meng Yang, Long Jin · 24 Aug 2023
MDCS: More Diverse Experts with Consistency Self-distillation for Long-tailed Recognition
QiHao Zhao, Chen Jiang, Wei Hu, Fangying Zhang, Jun Liu · 19 Aug 2023
Towards Personalized Federated Learning via Heterogeneous Model Reassembly [FedML]
Jiaqi Wang, Xingyi Yang, Suhan Cui, Liwei Che, Lingjuan Lyu, Dongkuan Xu, Fenglong Ma · 16 Aug 2023
Multi-Label Knowledge Distillation
Penghui Yang, Ming-Kun Xie, Chen-Chen Zong, Lei Feng, Gang Niu, Masashi Sugiyama, Sheng-Jun Huang · 12 Aug 2023
Multi-View Fusion and Distillation for Subgrade Distresses Detection based on 3D-GPR
Chunpeng Zhou, Kang Ning, Haishuai Wang, Zhi Yu, Sheng Zhou, Jiajun Bu · 09 Aug 2023
Teacher-Student Architecture for Knowledge Distillation: A Survey
Chengming Hu, Xuan Li, Danyang Liu, Haolun Wu, Xi Chen, Ju Wang, Xue Liu · 08 Aug 2023
R-Block: Regularized Block of Dropout for convolutional networks
Liqi Wang, Qiyang Hu · 27 Jul 2023
Training-based Model Refinement and Representation Disagreement for Semi-Supervised Object Detection
S. M. Marvasti-Zadeh, Nilanjan Ray, Nadir Erbilgin · 25 Jul 2023
SwinMM: Masked Multi-view with Swin Transformers for 3D Medical Image Segmentation [ViT, MedIm]
Yiqing Wang, Zihan Li, Jieru Mei, Zi-Ying Wei, Li Liu, Chen Wang, Shengtian Sang, Alan Yuille, Cihang Xie, Yuyin Zhou · 24 Jul 2023
A Good Student is Cooperative and Reliable: CNN-Transformer Collaborative Learning for Semantic Segmentation
Jinjing Zhu, Yuan Luo, Xueye Zheng, Hao Wang, Lin Wang · 24 Jul 2023
Frameless Graph Knowledge Distillation
Dai Shi, Zhiqi Shao, Yi Guo, Junbin Gao · 13 Jul 2023
The Staged Knowledge Distillation in Video Classification: Harmonizing Student Progress by a Complementary Weakly Supervised Framework
Chao Wang, Zhenghang Tang · 11 Jul 2023
Test-Time Adaptation for Nighttime Color-Thermal Semantic Segmentation [TTA]
Yexin Liu, Weiming Zhang, Guoyang Zhao, Jinjing Zhu, Athanasios V. Vasilakos, Lin Wang · 10 Jul 2023
Towards Brain Inspired Design for Addressing the Shortcomings of ANNs
F. Sarfraz, Elahe Arani, Bahram Zonooz · 30 Jun 2023
Miniaturized Graph Convolutional Networks with Topologically Consistent Pruning
H. Sahbi · 30 Jun 2023
NCL++: Nested Collaborative Learning for Long-Tailed Visual Recognition [VLM]
Zichang Tan, Jun Yu Li, Jinhao Du, Jun Wan, Zhen Lei, Guodong Guo · 29 Jun 2023
Deep Transfer Learning for Intelligent Vehicle Perception: a Survey
Xinyi Liu, Jinlong Li, Jin Ma, Huiming Sun, Zhigang Xu, Tianyu Zhang, Hongkai Yu · 26 Jun 2023
Efficient Online Processing with Deep Neural Networks
Lukas Hedegaard · 23 Jun 2023
CrossKD: Cross-Head Knowledge Distillation for Object Detection
Jiabao Wang, Yuming Chen, Zhaohui Zheng, Xiang Li, Ming-Ming Cheng, Qibin Hou · 20 Jun 2023
Categories of Response-Based, Feature-Based, and Relation-Based Knowledge Distillation [VLM, OffRL]
Chuanguang Yang, Xinqiang Yu, Zhulin An, Yongjun Xu · 19 Jun 2023
EM-Network: Oracle Guided Self-distillation for Sequence Learning [VLM]
J. Yoon, Sunghwan Ahn, Hyeon Seung Lee, Minchan Kim, Seokhwan Kim, N. Kim · 14 Jun 2023
Population-Based Evolutionary Gaming for Unsupervised Person Re-identification
Yunpeng Zhai, Peixi Peng, Mengxi Jia, Shiyong Li, Weiqiang Chen, Xuesong Gao, Yonghong Tian · 08 Jun 2023
Optimal Transport Model Distributional Robustness [OOD]
Van-Anh Nguyen, Trung Le, Anh Tuan Bui, Thanh-Toan Do, Dinh Q. Phung · 07 Jun 2023
Inconsistency, Instability, and Generalization Gap of Deep Neural Network Training
Rie Johnson, Tong Zhang · 31 May 2023
Budget-Aware Graph Convolutional Network Design using Probabilistic Magnitude Pruning
H. Sahbi · 30 May 2023
Incomplete Multimodal Learning for Complex Brain Disorders Prediction
Reza Shirkavand, Liang Zhan, Heng-Chiao Huang, Li Shen, Paul M. Thompson · 25 May 2023
Triplet Knowledge Distillation
Xijun Wang, Dongyang Liu, Meina Kan, Chunrui Han, Zhongqin Wu, Shiguang Shan · 25 May 2023
Towards Higher Pareto Frontier in Multilingual Machine Translation
Yi-Chong Huang, Xiaocheng Feng, Xinwei Geng, Baohang Li, Bing Qin · 25 May 2023
Disentangled Phonetic Representation for Chinese Spelling Correction
Zihong Liang, Xiaojun Quan, Qifan Wang · 24 May 2023
Decoupled Kullback-Leibler Divergence Loss
Jiequan Cui, Zhuotao Tian, Zhisheng Zhong, Xiaojuan Qi, Bei Yu, Hanwang Zhang · 23 May 2023
NORM: Knowledge Distillation via N-to-One Representation Matching
Xiaolong Liu, Lujun Li, Chao Li, Anbang Yao · 23 May 2023
Is Synthetic Data From Diffusion Models Ready for Knowledge Distillation?
Zheng Li, Yuxuan Li, Penghai Zhao, Renjie Song, Xiang Li, Jian Yang · 22 May 2023
Self-Distillation with Meta Learning for Knowledge Graph Completion
Yunshui Li, Junhao Liu, Chengming Li, Min Yang · 20 May 2023
Privacy in Multimodal Federated Human Activity Recognition
Alexandru Iacob, Pedro Gusmão, Nicholas D. Lane, Armand K. Koupai, M. J. Bocus, Raúl Santos-Rodríguez, Robert Piechocki, Ryan McConville · 20 May 2023
DisCo: Distilled Student Models Co-training for Semi-supervised Text Mining
Weifeng Jiang, Qianren Mao, Chenghua Lin, Jianxin Li, Ting Deng, Weiyi Yang, Zihan Wang · 20 May 2023
Student-friendly Knowledge Distillation
Mengyang Yuan, Bo Lang, Fengnan Quan · 18 May 2023
Tailoring Instructions to Student's Learning Levels Boosts Knowledge Distillation
Yuxin Ren, Zi-Qi Zhong, Xingjian Shi, Yi Zhu, Chun Yuan, Mu Li · 16 May 2023
GeNAS: Neural Architecture Search with Better Generalization
Joonhyun Jeong, Joonsang Yu, Geondo Park, Dongyoon Han, Y. Yoo · 15 May 2023
Robust Saliency-Aware Distillation for Few-shot Fine-grained Visual Recognition
Haiqi Liu, Chong Chen, Xinrong Gong, Tong Zhang · 12 May 2023
EAML: Ensemble Self-Attention-based Mutual Learning Network for Document Image Classification
Souhail Bakkali, Zuheng Ming, Mickael Coustaty, Marçal Rusiñol · 11 May 2023
FedPDD: A Privacy-preserving Double Distillation Framework for Cross-silo Federated Recommendation [FedML]
Sheng Wan, Dashan Gao, Hanlin Gu, Daning Hu · 09 May 2023
DynamicKD: An Effective Knowledge Distillation via Dynamic Entropy Correction-Based Distillation for Gap Optimizing
Songling Zhu, Ronghua Shang, Bo Yuan, Weitong Zhang, Yangyang Li, Licheng Jiao · 09 May 2023
Bi-Mapper: Holistic BEV Semantic Mapping for Autonomous Driving
Siyu Li, Kailun Yang, Haowen Shi, Jiaming Zhang, Jiacheng Lin, Zhifeng Teng, Zhiyong Li · 07 May 2023
Smaller3d: Smaller Models for 3D Semantic Segmentation Using Minkowski Engine and Knowledge Distillation Methods [3DPC]
Alen Adamyan, Erik Harutyunyan · 04 May 2023
Stimulative Training++: Go Beyond The Performance Limits of Residual Networks
XinYu Piao, Tong He, DoangJoo Synn, Baopu Li, Tao Chen, Lei Bai, Jong-Kook Kim · 04 May 2023