ResearchTrend.AI

Quantized Distillation: Optimizing Driver Activity Recognition Models for Resource-Constrained Environments (arXiv:2311.05970)

10 November 2023
Calvin Tanama
Kunyu Peng
Zdravko Marinov
Rainer Stiefelhagen
Alina Roitberg

Papers citing "Quantized Distillation: Optimizing Driver Activity Recognition Models for Resource-Constrained Environments"

5 / 5 papers shown
Overcoming Oscillations in Quantization-Aware Training
Markus Nagel
Marios Fournarakis
Yelysei Bondarenko
Tijmen Blankevoort
MQ
111
101
0
21 Mar 2022
TransKD: Transformer Knowledge Distillation for Efficient Semantic Segmentation
R. Liu
Kailun Yang
Alina Roitberg
Jiaming Zhang
Kunyu Peng
Huayao Liu
Yaonan Wang
Rainer Stiefelhagen
ViT
47
36
0
27 Feb 2022
FBGEMM: Enabling High-Performance Low-Precision Deep Learning Inference
D. Khudia
Jianyu Huang
Protonu Basu
Summer Deng
Haixin Liu
Jongsoo Park
M. Smelyanskiy
FedML
MQ
51
46
0
13 Jan 2021
What is the State of Neural Network Pruning?
Davis W. Blalock
Jose Javier Gonzalez Ortiz
Jonathan Frankle
John Guttag
191
1,027
0
06 Mar 2020
MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
Andrew G. Howard
Menglong Zhu
Bo Chen
Dmitry Kalenichenko
Weijun Wang
Tobias Weyand
M. Andreetto
Hartwig Adam
3DH
950
20,572
0
17 Apr 2017