Quantifying the Knowledge in a DNN to Explain Knowledge Distillation for Classification
arXiv: 2208.08741 · 18 August 2022
Quanshi Zhang, Xu Cheng, Yilan Chen, Zhefan Rao
Papers citing "Quantifying the Knowledge in a DNN to Explain Knowledge Distillation for Classification" (8 / 8 papers shown)

| Title | Authors | Topics | Citations | Date |
|---|---|---|---|---|
| Contrastive Representation Distillation via Multi-Scale Feature Decoupling | Cuipeng Wang, Tieyuan Chen, Haipeng Wang | — | 0 | 09 Feb 2025 |
| Hardware-Aware DNN Compression for Homogeneous Edge Devices | Kunlong Zhang, Guiying Li, Ning Lu, Peng Yang, K. Tang | — | 0 | 28 Jan 2025 |
| Quantifying Knowledge Distillation Using Partial Information Decomposition | Pasan Dissanayake, Faisal Hamman, Barproda Halder, Ilia Sucholutsky, Qiuyi Zhang, Sanghamitra Dutta | — | 0 | 12 Nov 2024 |
| Understanding the Effects of Projectors in Knowledge Distillation | Yudong Chen, Sen Wang, Jiajun Liu, Xuwei Xu, Frank de Hoog, Brano Kusy, Zi Huang | — | 0 | 26 Oct 2023 |
| PointNet: Deep Learning on Point Sets for 3D Classification and Segmentation | C. Qi, Hao Su, Kaichun Mo, Leonidas J. Guibas | 3DH, 3DPC, 3DV, PINN | 14,103 | 02 Dec 2016 |
| On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima | N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang | ODL | 2,890 | 15 Sep 2016 |
| Norm-Based Capacity Control in Neural Networks | Behnam Neyshabur, Ryota Tomioka, Nathan Srebro | — | 577 | 27 Feb 2015 |
| ImageNet Large Scale Visual Recognition Challenge | Olga Russakovsky, Jia Deng, Hao Su, J. Krause, S. Satheesh, …, A. Karpathy, A. Khosla, Michael S. Bernstein, Alexander C. Berg, Li Fei-Fei | VLM, ObjD | 39,198 | 01 Sep 2014 |