ResearchTrend.AI

© 2025 ResearchTrend.AI. All rights reserved.

Online Model Distillation for Efficient Video Inference

arXiv:1812.02699 · 6 December 2018

Ravi Teja Mullapudi, Steven Chen, Keyi Zhang, Deva Ramanan, Kayvon Fatahalian

Papers citing "Online Model Distillation for Efficient Video Inference"

Showing 16 of 66 citing papers.

  • It's always personal: Using Early Exits for Efficient On-Device CNN Personalisation
    Ilias Leontiadis, Stefanos Laskaridis, Stylianos I. Venieris, Nicholas D. Lane
    02 Feb 2021
  • Ekya: Continuous Learning of Video Analytics Models on Edge Compute Servers
    Romil Bhardwaj, Zhengxu Xia, Ganesh Ananthanarayanan, Junchen Jiang, Nikolaos Karianakis, Yuanchao Shu, Kevin Hsieh, P. Bahl, Ion Stoica
    19 Dec 2020
  • ODIN: Automated Drift Detection and Recovery in Video Analytics
    Abhijit Suprem, Joy Arulraj, C. Pu, J. E. Ferreira
    09 Sep 2020
  • Self-Supervised Policy Adaptation during Deployment
    Nicklas Hansen, Rishabh Jangir, Yu Sun, Guillem Alenyà, Pieter Abbeel, Alexei A. Efros, Lerrel Pinto, Xiaolong Wang
    08 Jul 2020
  • Space-Time Correspondence as a Contrastive Random Walk
    Allan Jabri, Andrew Owens, Alexei A. Efros
    25 Jun 2020
  • Real-Time Video Inference on Edge Devices via Adaptive Model Streaming
    Mehrdad Khani Shirkoohi, Pouya Hamadanian, Arash Nasr-Esfahany, Mohammad Alizadeh
    11 Jun 2020
  • Knowledge Distillation: A Survey
    Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
    09 Jun 2020
  • Towards Streaming Perception
    Mengtian Li, Yu-Xiong Wang, Deva Ramanan
    21 May 2020
  • Interactive Video Stylization Using Few-Shot Patch-Based Training
    Ondrej Texler, David Futschik, Michal Kučera, Ondrej Jamriska, Sárka Sochorová, Menglei Chai, Sergey Tulyakov, Daniel Sýkora
    29 Apr 2020
  • Enabling Incremental Knowledge Transfer for Object Detection at the Edge
    Mohammad Farhadi Bajestani, Mehdi Ghasemi, S. Vrudhula, Yezhou Yang
    13 Apr 2020
  • Neural Networks Are More Productive Teachers Than Human Raters: Active Mixup for Data-Efficient Knowledge Distillation from a Blackbox Model
    Dongdong Wang, Yandong Li, Liqiang Wang, Boqing Gong
    31 Mar 2020
  • Fast and Three-rious: Speeding Up Weak Supervision with Triplet Methods
    Daniel Y. Fu, Mayee F. Chen, Frederic Sala, Sarah Hooper, Kayvon Fatahalian, Christopher Ré
    27 Feb 2020
  • Side-Tuning: A Baseline for Network Adaptation via Additive Side Networks
    Jeffrey O. Zhang, Alexander Sax, Amir Zamir, Leonidas Guibas, Jitendra Malik
    31 Dec 2019
  • Test-Time Training with Self-Supervision for Generalization under Distribution Shifts
    Yu Sun, Xiaolong Wang, Zhuang Liu, John Miller, Alexei A. Efros, Moritz Hardt
    29 Sep 2019
  • TKD: Temporal Knowledge Distillation for Active Perception
    Mohammad Farhadi, Yezhou Yang
    04 Mar 2019
  • Dataset Culling: Towards Efficient Training of Distillation-Based Domain Specific Models
    Kentaro Yoshioka, Edward H. Lee, S. Wong, M. Horowitz
    01 Feb 2019