Cross-modal knowledge distillation for action recognition (arXiv:1910.04641)
Fida Mohammad Thoker, Juergen Gall
10 October 2019
Papers citing "Cross-modal knowledge distillation for action recognition" (9 of 9 papers shown)
Leveraging Topological Guidance for Improved Knowledge Distillation
Eun Som Jeon, Rahul Khurana, Aishani Pathak, P. Turaga
07 Jul 2024

VideoAdviser: Video Knowledge Distillation for Multimodal Transfer Learning
Yanan Wang, Donghuo Zeng, Shinya Wada, Satoshi Kurihara
27 Sep 2023

Audio Representation Learning by Distilling Video as Privileged Information
Amirhossein Hajavi, Ali Etemad
06 Feb 2023

Enabling All In-Edge Deep Learning: A Literature Review
Praveen Joshi, Mohammed Hasanuzzaman, Chandra Thapa, Haithem Afli, T. Scully
07 Apr 2022

Distillation of Human-Object Interaction Contexts for Action Recognition
Muna Almushyti, Frederick W. Li
17 Dec 2021

EvDistill: Asynchronous Events to End-task Learning via Bidirectional Reconstruction-guided Cross-modal Knowledge Distillation
Lin Wang, Yujeong Chae, Sung-Hoon Yoon, Tae-Kyun Kim, Kuk-Jin Yoon
24 Nov 2021

DnS: Distill-and-Select for Efficient and Accurate Video Indexing and Retrieval
Giorgos Kordopatis-Zilos, Christos Tzelepis, Symeon Papadopoulos, I. Kompatsiaris, Ioannis Patras
24 Jun 2021

End-to-End Automatic Speech Recognition with Deep Mutual Learning
Ryo Masumura, Mana Ihori, Akihiko Takashima, Tomohiro Tanaka, Takanori Ashihara
16 Feb 2021

Knowledge Distillation: A Survey
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
09 Jun 2020