
Multi-Branch Mutual-Distillation Transformer for EEG-Based Seizure Subtype Classification (arXiv:2412.15224)

4 December 2024
Ruimin Peng, Zhenbang Du, Changming Zhao, Jingwei Luo, Wenzhong Liu, Xinxing Chen, Dongrui Wu
MedIm

Papers citing "Multi-Branch Mutual-Distillation Transformer for EEG-Based Seizure Subtype Classification"

3 / 3 papers shown

Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification
Yixiao Ge, Xiao Zhang, Ching Lam Choi, Ka Chun Cheung, Peipei Zhao, Feng Zhu, Xiaogang Wang, Rui Zhao, Hongsheng Li
FedML, UQCV
27 Apr 2021

What is the State of Neural Network Pruning?
Davis W. Blalock, Jose Javier Gonzalez Ortiz, Jonathan Frankle, John Guttag
06 Mar 2020

Knowledge Distillation by On-the-Fly Native Ensemble
Xu Lan, Xiatian Zhu, S. Gong
12 Jun 2018