Cascaded channel pruning using hierarchical self-distillation

Roy Miles, K. Mikolajczyk · 16 August 2020 · arXiv:2008.06814

Papers citing "Cascaded channel pruning using hierarchical self-distillation"

3 papers shown:

1. Learning to Project for Cross-Task Knowledge Distillation
   Dylan Auty, Roy Miles, Benedikt Kolbeinsson, K. Mikolajczyk
   21 Mar 2024

2. Information Theoretic Representation Distillation (MQ)
   Roy Miles, Adrian Lopez-Rodriguez, K. Mikolajczyk
   01 Dec 2021

3. Deep Ordinal Regression Network for Monocular Depth Estimation (MDE)
   Huan Fu, Biwei Huang, Chaohui Wang, Kayhan Batmanghelich, Dacheng Tao
   06 Jun 2018