arXiv: 2202.07846
Knowledge Distillation with Deep Supervision
Shiya Luo, Defang Chen, Can Wang
16 February 2022
Links: arXiv (abs) | PDF | HTML | GitHub (1★)
Papers citing "Knowledge Distillation with Deep Supervision" (4 papers):

- "A Layered Self-Supervised Knowledge Distillation Framework for Efficient Multimodal Learning on the Edge". Tarique Dahri, Zulfiqar Ali Memon, Zhenyu Yu, Mohd Yamani Idna Idris, Sheheryar Khan, Sadiq Ahmad, Maged Shoman, Saddam Aziz, Rizwan Qureshi. 08 Jun 2025.
- "Rethinking Intermediate Layers design in Knowledge Distillation for Kidney and Liver Tumor Segmentation". Vandan Gorade, Sparsh Mittal, Debesh Jha, Ulas Bagci. 28 Nov 2023.
- "Contrastive Representation Distillation". Yonglong Tian, Dilip Krishnan, Phillip Isola. 23 Oct 2019.
- "Xception: Deep Learning with Depthwise Separable Convolutions". François Chollet. 07 Oct 2016.