Knowledge Distillation via Instance-level Sequence Learning
arXiv: 2106.10885 · 21 June 2021
Haoran Zhao, Xin Sun, Junyu Dong, Zihe Dong, Qiong Li
Papers citing "Knowledge Distillation via Instance-level Sequence Learning" (5 of 5 shown):

sDREAMER: Self-distilled Mixture-of-Modality-Experts Transformer for Automatic Sleep Staging
Jingyuan Chen, Yuan Yao, Mie Anderson, Natalie Hauglund, Celia Kjaerby, Verena Untiet, Maiken Nedergaard, Jiebo Luo
28 Jan 2025

Confidence-Aware Paced-Curriculum Learning by Label Smoothing for Surgical Scene Understanding
Mengya Xu, Mobarakol Islam, Ben Glocker, Hongliang Ren
22 Dec 2022

Curriculum Temperature for Knowledge Distillation
Zheng Li, Xiang Li, Lingfeng Yang, Borui Zhao, Renjie Song, Lei Luo, Jun Yu Li, Jian Yang
29 Nov 2022

Adaptive Mixing of Auxiliary Losses in Supervised Learning
D. Sivasubramanian, Ayush Maheshwari, Pradeep Shenoy, A. Prathosh, Ganesh Ramakrishnan
07 Feb 2022

MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam
17 Apr 2017