On the Demystification of Knowledge Distillation: A Residual Network Perspective

30 June 2020
N. Jha, Rajat Saini, Sparsh Mittal
ArXiv (abs) · PDF · HTML

Papers citing "On the Demystification of Knowledge Distillation: A Residual Network Perspective"

19 papers shown

Towards Understanding Knowledge Distillation
Mary Phuong, Christoph H. Lampert
67 · 322 · 0 · 27 May 2021

Heterogeneous Knowledge Distillation using Information Flow Modeling
Nikolaos Passalis, Maria Tzelepi, Anastasios Tefas
73 · 139 · 0 · 02 May 2020

Understanding and Improving Knowledge Distillation
Jiaxi Tang, Rakesh Shivanna, Zhe Zhao, Dong Lin, Anima Singh, Ed H. Chi, Sagar Jain
88 · 133 · 0 · 10 Feb 2020

Search to Distill: Pearls are Everywhere but not the Eyes
Yu Liu, Xuhui Jia, Mingxing Tan, Raviteja Vemulapalli, Yukun Zhu, Bradley Green, Xiaogang Wang
88 · 68 · 0 · 20 Nov 2019

Contrastive Representation Distillation
Yonglong Tian, Dilip Krishnan, Phillip Isola
165 · 1,053 · 0 · 23 Oct 2019

On the Efficacy of Knowledge Distillation
Ligang He, Rui Mao
98 · 618 · 0 · 03 Oct 2019

Towards Understanding the Importance of Shortcut Connections in Residual Networks
Tianyi Liu, Minshuo Chen, Mo Zhou, S. Du, Enlu Zhou, T. Zhao
31 · 45 · 0 · 10 Sep 2019

DropBlock: A regularization method for convolutional networks
Golnaz Ghiasi, Nayeon Lee, Quoc V. Le
115 · 914 · 0 · 30 Oct 2018

Knowledge Distillation in Generations: More Tolerant Teachers Educate Better Students
Chenglin Yang, Lingxi Xie, Siyuan Qiao, Alan Yuille
70 · 136 · 0 · 15 May 2018

Visualizing the Loss Landscape of Neural Nets
Hao Li, Zheng Xu, Gavin Taylor, Christoph Studer, Tom Goldstein
260 · 1,898 · 0 · 28 Dec 2017

Deep Mutual Learning
Ying Zhang, Tao Xiang, Timothy M. Hospedales, Huchuan Lu
FedML · 153 · 1,654 · 0 · 01 Jun 2017

The Shattered Gradients Problem: If resnets are the answer, then what is the question?
David Balduzzi, Marcus Frean, Lennox Leary, J. P. Lewis, Kurt Wan-Duo Ma, Brian McWilliams
ODL · 73 · 406 · 0 · 28 Feb 2017

Aggregated Residual Transformations for Deep Neural Networks
Saining Xie, Ross B. Girshick, Piotr Dollár, Zhuowen Tu, Kaiming He
522 · 10,347 · 0 · 16 Nov 2016

Wide Residual Networks
Sergey Zagoruyko, N. Komodakis
353 · 8,000 · 0 · 23 May 2016

Deep Roots: Improving CNN Efficiency with Hierarchical Filter Groups
Yani Andrew Ioannou, D. Robertson, R. Cipolla, A. Criminisi
80 · 265 · 0 · 20 May 2016

Deep Networks with Stochastic Depth
Gao Huang, Yu Sun, Zhuang Liu, Daniel Sedra, Kilian Q. Weinberger
215 · 2,361 · 0 · 30 Mar 2016

Identity Mappings in Deep Residual Networks
Kaiming He, Xinming Zhang, Shaoqing Ren, Jian Sun
354 · 10,196 · 0 · 16 Mar 2016

Unifying distillation and privileged information
David Lopez-Paz, Léon Bottou, Bernhard Schölkopf, V. Vapnik
FedML · 169 · 463 · 0 · 11 Nov 2015

FitNets: Hints for Thin Deep Nets
Adriana Romero, Nicolas Ballas, Samira Ebrahimi Kahou, Antoine Chassang, C. Gatta, Yoshua Bengio
FedML · 319 · 3,898 · 0 · 19 Dec 2014