Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution
Chuanguang Yang, Zhulin An, Linhang Cai, Yongjun Xu
7 September 2021 · arXiv:2109.03075

Papers citing "Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution" (13 of 13 papers shown)

Title | Authors | Tags | Metrics | Date
Cross-Image Relational Knowledge Distillation for Semantic Segmentation | Chuanguang Yang, Helong Zhou, Zhulin An, Xue Jiang, Yong Xu, Qian Zhang | — | 79 / 170 / 0 | 14 Apr 2022
Student Network Learning via Evolutionary Knowledge Distillation | Kangkai Zhang, Chunhui Zhang, Shikun Li, Dan Zeng, Shiming Ge | — | 46 / 83 / 0 | 23 Mar 2021
Knowledge Distillation Meets Self-Supervision | Guodong Xu, Ziwei Liu, Xiaoxiao Li, Chen Change Loy | FedML | 64 / 281 / 0 | 12 Jun 2020
Heterogeneous Knowledge Distillation using Information Flow Modeling | Nikolaos Passalis, Maria Tzelepi, Anastasios Tefas | — | 44 / 139 / 0 | 02 May 2020
Video Cloze Procedure for Self-Supervised Spatio-Temporal Learning | Dezhao Luo, Chang-rui Liu, Yu Zhou, Dongbao Yang, Can Ma, QiXiang Ye, Weiping Wang | SSL | 31 / 160 / 0 | 02 Jan 2020
Gated Convolutional Networks with Hybrid Connectivity for Image Classification | Chuanguang Yang, Zhulin An, Hui Zhu, Xiaolong Hu, Boyu Diao, Kaiqiang Xu, Chao Li, Yongjun Xu | — | 39 / 51 / 0 | 26 Aug 2019
Similarity-Preserving Knowledge Distillation | Frederick Tung, Greg Mori | — | 86 / 963 / 0 | 23 Jul 2019
Deep High-Resolution Representation Learning for Human Pose Estimation | Ke Sun, Bin Xiao, Dong Liu, Jingdong Wang | 3DV | 95 / 4,024 / 0 | 25 Feb 2019
ShuffleNet V2: Practical Guidelines for Efficient CNN Architecture Design | Ningning Ma, Xiangyu Zhang, Haitao Zheng, Jian Sun | — | 122 / 4,957 / 0 | 30 Jul 2018
Paying More Attention to Attention: Improving the Performance of Convolutional Neural Networks via Attention Transfer | Sergey Zagoruyko, N. Komodakis | — | 92 / 2,561 / 0 | 12 Dec 2016
Pyramid Scene Parsing Network | Hengshuang Zhao, Jianping Shi, Xiaojuan Qi, Xiaogang Wang, Jiaya Jia | VOS, SSeg | 276 / 11,941 / 0 | 04 Dec 2016
FitNets: Hints for Thin Deep Nets | Adriana Romero, Nicolas Ballas, Samira Ebrahimi Kahou, Antoine Chassang, C. Gatta, Yoshua Bengio | FedML | 214 / 3,862 / 0 | 19 Dec 2014
Deeply-Supervised Nets | Chen-Yu Lee, Saining Xie, Patrick W. Gallagher, Zhengyou Zhang, Zhuowen Tu | — | 233 / 2,229 / 0 | 18 Sep 2014