Multi-Branch Mutual-Distillation Transformer for EEG-Based Seizure Subtype Classification
arXiv:2412.15224 · 4 December 2024
Ruimin Peng, Zhenbang Du, Changming Zhao, Jingwei Luo, Wenzhong Liu, Xinxing Chen, Dongrui Wu
Community: MedIm
Papers citing "Multi-Branch Mutual-Distillation Transformer for EEG-Based Seizure Subtype Classification" (3 of 3 shown):
Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification
  Yixiao Ge, Xiao Zhang, Ching Lam Choi, Ka Chun Cheung, Peipei Zhao, Feng Zhu, Xiaogang Wang, Rui Zhao, Hongsheng Li
  FedML, UQCV · 140 · 25 · 0 · 27 Apr 2021
What is the State of Neural Network Pruning?
  Davis W. Blalock, Jose Javier Gonzalez Ortiz, Jonathan Frankle, John Guttag
  191 · 1,032 · 0 · 06 Mar 2020
Knowledge Distillation by On-the-Fly Native Ensemble
  Xu Lan, Xiatian Zhu, S. Gong
  212 · 474 · 0 · 12 Jun 2018