Distilling Knowledge via Knowledge Review

19 April 2021
Pengguang Chen, Shu Liu, Hengshuang Zhao, Jiaya Jia
ArXiv (abs) · PDF · HTML · GitHub (272★)

Papers citing "Distilling Knowledge via Knowledge Review"

15 / 215 papers shown

Knowledge Distillation with the Reused Teacher Classifier
Defang Chen, Jianhan Mei, Hailin Zhang, C. Wang, Yan Feng, Chun-Yen Chen
26 Mar 2022

Decoupled Knowledge Distillation
Borui Zhao, Quan Cui, Renjie Song, Yiyu Qiu, Jiajun Liang
16 Mar 2022

TransKD: Transformer Knowledge Distillation for Efficient Semantic Segmentation
R. Liu, Kailun Yang, Alina Roitberg, Jiaming Zhang, Kunyu Peng, Huayao Liu, Yaonan Wang, Rainer Stiefelhagen
Topics: ViT
27 Feb 2022

MonoDistill: Learning Spatial Features for Monocular 3D Object Detection
Zhiyu Chong, Xinzhu Ma, Hong Zhang, Yuxin Yue, Haojie Li, Zhihui Wang, Wanli Ouyang
Topics: 3DPC
26 Jan 2022

Anomaly Detection via Reverse Distillation from One-Class Embedding
Hanqiu Deng, Xingyu Li
Topics: UQCV
26 Jan 2022

Pixel Distillation: A New Knowledge Distillation Scheme for Low-Resolution Image Recognition
Guangyu Guo, Dingwen Zhang, Longfei Han, Nian Liu, Ming-Ming Cheng, Junwei Han
17 Dec 2021

Information Theoretic Representation Distillation
Roy Miles, Adrian Lopez-Rodriguez, K. Mikolajczyk
Topics: MQ
01 Dec 2021

Oracle Teacher: Leveraging Target Information for Better Knowledge Distillation of CTC Models
J. Yoon, H. Kim, Hyeon Seung Lee, Sunghwan Ahn, N. Kim
05 Nov 2021

Response-based Distillation for Incremental Object Detection
Tao Feng, Mang Wang
Topics: ObjD, CLL
26 Oct 2021

Distilling and Transferring Knowledge via cGAN-generated Samples for Image Classification and Regression
Xin Ding, Z. J. Wang, Zuheng Xu, Z. Jane Wang, William J. Welch
07 Apr 2021

Distilling a Powerful Student Model via Online Knowledge Distillation
Shaojie Li, Mingbao Lin, Yan Wang, Yongjian Wu, Yonghong Tian, Ling Shao, Rongrong Ji
Topics: FedML
26 Mar 2021

PURSUhInT: In Search of Informative Hint Points Based on Layer Clustering for Knowledge Distillation
Reyhan Kevser Keser, Aydin Ayanzadeh, O. A. Aghdam, Çaglar Kilcioglu, B. U. Toreyin, N. K. Üre
26 Feb 2021

Learnable Boundary Guided Adversarial Training
Jiequan Cui, Shu Liu, Liwei Wang, Jiaya Jia
Topics: OOD, AAML
23 Nov 2020

Adjoined Networks: A Training Paradigm with Applications to Network Compression
Utkarsh Nath, Shrinu Kushagra, Yingzhen Yang
10 Jun 2020

Contrastive Representation Distillation
Yonglong Tian, Dilip Krishnan, Phillip Isola
23 Oct 2019