Knowledge Distillation with Deep Supervision

16 February 2022
Shiya Luo, Defang Chen, Can Wang
ArXiv (abs) · PDF · HTML · GitHub (1★)
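
The title combines two standard ideas: knowledge distillation (matching a student's temperature-softened logits to a teacher's) and deep supervision (attaching auxiliary classifier heads to intermediate layers so early layers receive direct gradient signal). Below is a minimal sketch of how the two compose, assuming the usual Hinton-style KD loss; the function names, the per-head loop, and the hyperparameters T and alpha are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch: Hinton-style distillation applied at several supervised
# depths. Head count and hyperparameters are assumptions for illustration,
# not the paper's actual method.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, T=4.0):
    """KL divergence between temperature-softened student and teacher logits."""
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)

def deeply_supervised_kd_loss(student_heads, teacher_heads, labels, alpha=0.5):
    """student_heads / teacher_heads: lists of logits, one per supervised depth
    (auxiliary classifiers on intermediate layers, final output last)."""
    loss = F.cross_entropy(student_heads[-1], labels)  # hard-label term
    for s_logits, t_logits in zip(student_heads, teacher_heads):
        loss = loss + alpha * kd_loss(s_logits, t_logits.detach())
    return loss

# Toy usage with random logits: three heads, ten classes, batch of eight.
heads_s = [torch.randn(8, 10, requires_grad=True) for _ in range(3)]
heads_t = [torch.randn(8, 10) for _ in range(3)]
labels = torch.randint(0, 10, (8,))
deeply_supervised_kd_loss(heads_s, heads_t, labels).backward()
```

In this formulation the teacher provides soft targets at every supervised depth, so the student's early layers are trained directly against teacher behavior rather than only through the final output loss.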

Papers citing "Knowledge Distillation with Deep Supervision"

4 papers shown.

  • A Layered Self-Supervised Knowledge Distillation Framework for Efficient Multimodal Learning on the Edge
    Tarique Dahri, Zulfiqar Ali Memon, Zhenyu Yu, Mohd Yamani Idna Idris, Sheheryar Khan, Sadiq Ahmad, Maged Shoman, Saddam Aziz, Rizwan Qureshi
    08 Jun 2025 · 20 / 0 / 0
  • Rethinking Intermediate Layers design in Knowledge Distillation for Kidney and Liver Tumor Segmentation
    Vandan Gorade, Sparsh Mittal, Debesh Jha, Ulas Bagci
    28 Nov 2023 · 62 / 3 / 0
  • Contrastive Representation Distillation
    Yonglong Tian, Dilip Krishnan, Phillip Isola
    23 Oct 2019 · 207 / 1,057 / 0
  • Xception: Deep Learning with Depthwise Separable Convolutions
    François Chollet · MDE, BDL, PINN
    07 Oct 2016 · 1.6K / 14,656 / 0