Deep Mutual Learning
Ying Zhang, Tao Xiang, Timothy M. Hospedales, Huchuan Lu
1 June 2017. [FedML]

Papers citing "Deep Mutual Learning"

50 / 710 papers shown
  • Observations on K-image Expansion of Image-Mixing Augmentation for Classification. Joonhyun Jeong, Sungmin Cha, Jongwon Choi, Sangdoo Yun, Taesup Moon, Y. Yoo. 08 Oct 2021. [VLM]
  • Model Adaptation: Historical Contrastive Learning for Unsupervised Domain Adaptation without Source Data. Jiaxing Huang, Dayan Guan, Aoran Xiao, Shijian Lu. 07 Oct 2021.
  • Decoupled Adaptation for Cross-Domain Object Detection. Junguang Jiang, Baixu Chen, Jianmin Wang, Mingsheng Long. 06 Oct 2021. [ObjD]
  • Personalized Retrogress-Resilient Framework for Real-World Medical Federated Learning. Zhen Chen, Meilu Zhu, Chen Yang, Yixuan Yuan. 01 Oct 2021. [OOD]
  • Student Helping Teacher: Teacher Evolution via Self-Knowledge Distillation. Zheng Li, Xiang Li, Lingfeng Yang, Jian Yang, Zhigeng Pan. 01 Oct 2021.
  • Building an Efficient and Effective Retrieval-based Dialogue System via Mutual Learning. Chongyang Tao, Jiazhan Feng, Chang Liu, Juntao Li, Xiubo Geng, Daxin Jiang. 01 Oct 2021. [RALM]
  • Semi-Supervised Text Classification via Self-Pretraining. Payam Karisani, Negin Karisani. 30 Sep 2021. [SSL, VLM]
  • Metal Artifact Reduction in 2D CT Images with Self-supervised Cross-domain Learning. Lequan Yu, Zhicheng Zhang, Xiaomeng Li, Hongyi Ren, Wei Zhao, Lei Xing. 28 Sep 2021. [MedIm]
  • Deep Structured Instance Graph for Distilling Object Detectors. Yixin Chen, Pengguang Chen, Shu Liu, Liwei Wang, Jiaya Jia. 27 Sep 2021. [ObjD, ISeg]
  • Partial to Whole Knowledge Distillation: Progressive Distilling Decomposed Knowledge Boosts Student Better. Xuanyang Zhang, Xinming Zhang, Jian Sun. 26 Sep 2021.
  • LGD: Label-guided Self-distillation for Object Detection. Peizhen Zhang, Zijian Kang, Tong Yang, Xinming Zhang, N. Zheng, Jian Sun. 23 Sep 2021. [ObjD]
  • Mutual Consistency Learning for Semi-supervised Medical Image Segmentation. Yicheng Wu, Z. Ge, Donghao Zhang, Minfeng Xu, Lei Zhang, Yong-quan Xia, Jianfei Cai. 21 Sep 2021. [OOD, SSL]
  • Personalized Federated Learning for Heterogeneous Clients with Clustered Knowledge Transfer. Yae Jee Cho, Jianyu Wang, Tarun Chiruvolu, Gauri Joshi. 16 Sep 2021. [FedML]
  • Partner-Assisted Learning for Few-Shot Image Classification. Jiawei Ma, Hanchen Xie, G. Han, Shih-Fu Chang, Aram Galstyan, Wael AbdAlmageed. 15 Sep 2021. [VLM]
  • Unsupervised domain adaptation for cross-modality liver segmentation via joint adversarial learning and self-learning. Jin Hong, S. Yu, Weitian Chen. 13 Sep 2021. [OOD, MedIm]
  • PP-OCRv2: Bag of Tricks for Ultra Lightweight OCR System. Yuning Du, Chenxia Li, Ruoyu Guo, Cheng Cui, Weiwei Liu, ..., Yehua Yang, Qiwen Liu, Xiaoguang Hu, Dianhai Yu, Yanjun Ma. 07 Sep 2021.
  • Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution. Chuanguang Yang, Zhulin An, Linhang Cai, Yongjun Xu. 07 Sep 2021.
  • ISyNet: Convolutional Neural Networks design for AI accelerator. Alexey Letunovskiy, Vladimir Korviakov, V. Polovnikov, Anastasiia Kargapoltseva, I. Mazurenko, Yepan Xiong. 04 Sep 2021.
  • Dual Transfer Learning for Event-based End-task Prediction via Pluggable Event to Image Translation. Lin Wang, Yujeong Chae, Kuk-Jin Yoon. 04 Sep 2021.
  • FedKD: Communication Efficient Federated Learning via Knowledge Distillation. Chuhan Wu, Fangzhao Wu, Lingjuan Lyu, Yongfeng Huang, Xing Xie. 30 Aug 2021. [FedML]
  • LIGAR: Lightweight General-purpose Action Recognition. Evgeny Izutov. 30 Aug 2021.
  • Rethinking the Misalignment Problem in Dense Object Detection. Yang Yang, Min Li, Bo Meng, Junxing Ren, Degang Sun, Zihao Huang. 27 Aug 2021.
  • Binocular Mutual Learning for Improving Few-shot Classification. Ziqi Zhou, Xi Qiu, Jiangtao Xie, Jianan Wu, Chi Zhang. 27 Aug 2021. [SSL]
  • Efficient training of lightweight neural networks using Online Self-Acquired Knowledge Distillation. Maria Tzelepi, Anastasios Tefas. 26 Aug 2021.
  • MvSR-NAT: Multi-view Subset Regularization for Non-Autoregressive Machine Translation. Pan Xie, Zexian Li, Xiaohui Hu. 19 Aug 2021.
  • Towards Efficient and Data Agnostic Image Classification Training Pipeline for Embedded Systems. K. Prokofiev, V. Sovrasov. 16 Aug 2021. [3DH]
  • Multi-granularity for knowledge distillation. Baitan Shao, Ying Chen. 15 Aug 2021.
  • SimCVD: Simple Contrastive Voxel-Wise Representation Distillation for Semi-Supervised Medical Image Segmentation. Chenyu You, Yuan Zhou, Ruihan Zhao, Lawrence H. Staib, James S. Duncan. 13 Aug 2021.
  • No-Reference Image Quality Assessment by Hallucinating Pristine Features. Baoliang Chen, Lingyu Zhu, Chenqi Kong, Hanwei Zhu, Shiqi Wang, Zhu Li. 09 Aug 2021.
  • Neighborhood Consensus Contrastive Learning for Backward-Compatible Representation. Shengsen Wu, Liang Chen, Yihang Lou, Yan Bai, Tao Bai, Minghua Deng, Ling-yu Duan. 07 Aug 2021.
  • Transferring Knowledge Distillation for Multilingual Social Event Detection. Jiaqian Ren, Hao Peng, Lei Jiang, Jia Wu, Yongxin Tong, Lihong Wang, X. Bai, Bo Wang, Qiang Yang. 06 Aug 2021.
  • Online Knowledge Distillation for Efficient Pose Estimation. Zheng Li, Jingwen Ye, Xiuming Zhang, Ying Huang, Zhigeng Pan. 04 Aug 2021.
  • QuPeD: Quantized Personalization via Distillation with Applications to Federated Learning. Kaan Ozkara, Navjot Singh, Deepesh Data, Suhas Diggavi. 29 Jul 2021. [FedML, MQ]
  • Generalizing Gaze Estimation with Outlier-guided Collaborative Adaptation. Yunfei Liu, Ruicong Liu, Haofei Wang, Feng Lu. 29 Jul 2021. [OOD]
  • Open-Ended Learning Leads to Generally Capable Agents. Open-Ended Learning Team, Adam Stooke, Anuj Mahajan, Catarina Barros, Charlie Deck, ..., Nicolas Porcel, Roberta Raileanu, Steph Hughes-Fitt, Valentin Dalibard, Wojciech M. Czarnecki. 27 Jul 2021.
  • Preliminary Steps Towards Federated Sentiment Classification. Xin-Chun Li, Lan Li, De-Chuan Zhan, Yunfeng Shao, Bingshuai Li, Shaoming Song. 26 Jul 2021.
  • ROD: Reception-aware Online Distillation for Sparse Graphs. Wentao Zhang, Yuezihan Jiang, Yang Li, Zeang Sheng, Yu Shen, Xupeng Miao, Liang Wang, Zhi-Xin Yang, Bin Cui. 25 Jul 2021.
  • Modality-aware Mutual Learning for Multi-modal Medical Image Segmentation. Yao Zhang, Jiawei Yang, Jiang Tian, Zhongchao Shi, Cheng Zhong, Yang Zhang, Zhiqiang He. 21 Jul 2021.
  • Follow Your Path: a Progressive Method for Knowledge Distillation. Wenxian Shi, Yuxuan Song, Hao Zhou, Bohan Li, Lei Li. 20 Jul 2021.
  • Unpaired cross-modality educed distillation (CMEDL) for medical image segmentation. Jue Jiang, A. Rimner, Joseph O. Deasy, Harini Veeraraghavan. 16 Jul 2021.
  • Align before Fuse: Vision and Language Representation Learning with Momentum Distillation. Junnan Li, Ramprasaath R. Selvaraju, Akhilesh Deepak Gotmare, Shafiq Joty, Caiming Xiong, Guosheng Lin. 16 Jul 2021. [FaML]
  • Weight Reparametrization for Budget-Aware Network Pruning. Robin Dupont, H. Sahbi, Guillaume Michel. 08 Jul 2021.
  • Novel Visual Category Discovery with Dual Ranking Statistics and Mutual Knowledge Distillation. Bingchen Zhao, Kai Han. 07 Jul 2021.
  • VidLanKD: Improving Language Understanding via Video-Distilled Knowledge Transfer. Zineng Tang, Jaemin Cho, Hao Tan, Joey Tianyi Zhou. 06 Jul 2021. [VLM]
  • Revisiting Knowledge Distillation: An Inheritance and Exploration Framework. Zhen Huang, Xu Shen, Jun Xing, Tongliang Liu, Xinmei Tian, Houqiang Li, Bing Deng, Jianqiang Huang, Xiansheng Hua. 01 Jul 2021.
  • R-Drop: Regularized Dropout for Neural Networks. Xiaobo Liang, Lijun Wu, Juntao Li, Yue Wang, Qi Meng, Tao Qin, Wei Chen, Hao Fei, Tie-Yan Liu. 28 Jun 2021.
  • Attention-guided Progressive Mapping for Profile Face Recognition. Junyang Huang, Changxing Ding. 27 Jun 2021. [CVBM]
  • Simple Distillation Baselines for Improving Small Self-supervised Models. Jindong Gu, Wei Liu, Yonglong Tian. 21 Jun 2021.
  • Collaborative Training of Acoustic Encoders for Speech Recognition. Varun K. Nagaraja, Yangyang Shi, Ganesh Venkatesh, Ozlem Kalinli, M. Seltzer, Vikas Chandra. 16 Jun 2021.
  • Energy-efficient Knowledge Distillation for Spiking Neural Networks. Dongjin Lee, Seongsik Park, Jongwan Kim, Wuhyeong Doh, Sungroh Yoon. 14 Jun 2021.