ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

In Teacher We Trust: Learning Compressed Models for Pedestrian Detection

1 December 2016
Jonathan Shen, Noranart Vesdapunt, Vishnu Boddeti, Kris Kitani
arXiv:1612.00478

Papers citing "In Teacher We Trust: Learning Compressed Models for Pedestrian Detection"

5 papers shown:
1. MRI-based Alzheimer's disease prediction via distilling the knowledge in multi-modal data
   Hao Guan, Chaoyue Wang, Dacheng Tao (08 Apr 2021)
2. Knowledge Distillation: A Survey
   Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao (09 Jun 2020)
3. Compact recurrent neural networks for acoustic event detection on low-energy low-complexity platforms
   G. Cerutti, Rahul Prasad, Alessio Brutti, Elisabetta Farella (29 Jan 2020)
4. Knowledge Distillation for End-to-End Person Search
   Bharti Munjal, Fabio Galasso, S. Amin (03 Sep 2019)
5. Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
   Y. Gal, Zoubin Ghahramani (06 Jun 2015)