
Data Efficient Stagewise Knowledge Distillation
Versions: v1, v2, v3 (latest)

15 November 2019
Akshay Ravindra Kulkarni
Navid Panchi
Sharath Chandra Raparthy
Shital S. Chiddarwar
arXiv: 1911.06786 (abs / PDF / HTML)

Papers citing "Data Efficient Stagewise Knowledge Distillation"

1 / 1 papers shown

Parallel Blockwise Knowledge Distillation for Deep Neural Network Compression
Cody Blakeney
Xiaomin Li
Yan Yan
Ziliang Zong
05 Dec 2020