Collaborative Multi-Teacher Knowledge Distillation for Learning Low Bit-width Deep Neural Networks
27 October 2022
Cuong Pham, Tuan Hoang, Thanh-Toan Do
Topics: FedML, MQ

Papers citing "Collaborative Multi-Teacher Knowledge Distillation for Learning Low Bit-width Deep Neural Networks"

(2 of 2 papers shown)
1. Heuristic-Free Multi-Teacher Learning
   Huy Thong Nguyen, En-Hung Chu, Lenord Melvix, Jazon Jiao, Chunglin Wen, Benjamin Louie
   19 Nov 2024
2. ImageNet Large Scale Visual Recognition Challenge
   Olga Russakovsky, Jia Deng, Hao Su, J. Krause, S. Satheesh, ..., A. Karpathy, A. Khosla, Michael S. Bernstein, Alexander C. Berg, Li Fei-Fei
   Topics: VLM, ObjD
   01 Sep 2014