arXiv: 2405.16090
EEG-DBNet: A Dual-Branch Network for Motor-Imagery Brain-Computer Interfaces

25 May 2024
Xicheng Lou, Xinwei Li, Hongying Meng, Jun Hu, Meili Xu, Yue Zhao, Jiazhang Yang
arXiv (abs) · PDF · HTML · GitHub (8★)
Abstract

Motor imagery electroencephalogram (EEG)-based brain-computer interfaces (BCIs) aid individuals with restricted limb mobility. However, challenges like low signal-to-noise ratio and limited spatial resolution hinder accurate feature extraction from EEG signals, impacting classification. To tackle these issues, we propose an end-to-end dual-branch neural network (EEG-DBNet). This network decodes temporal and spectral sequences separately using distinct branches. Each branch has local and global convolution blocks for extracting local and global features. The temporal branch employs three convolutional layers with smaller kernels, fewer channels, and average pooling, while the spectral branch uses larger kernels, more channels, and max pooling. Global convolution blocks then extract comprehensive features. Outputs from both branches are concatenated and fed to fully connected layers for classification. Ablation experiments demonstrate that our architecture, with specialized convolutional parameters for temporal and spectral sequences, significantly improves classification accuracy compared to single-branch structures. The complementary relationship between local and global convolutional blocks compensates for traditional CNNs' limitations in global feature extraction. Our method achieves accuracies of 85.84% and 91.42% on BCI Competition 4-2a and 4-2b datasets, respectively, surpassing existing state-of-the-art models. Source code is available at https://github.com/xicheng105/EEG-DBNet.
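To make the dual-branch structure concrete, below is a minimal PyTorch sketch of the architecture as described in the abstract: two branches, each with a local convolution block (three convolutional layers) followed by a global stage, whose outputs are concatenated and passed to a fully connected classifier. The kernel sizes, channel counts, pooling windows, electrode count, and 4-class output are illustrative assumptions, not the paper's actual hyperparameters; likewise, how the spectral sequence is derived from the EEG signal is not specified in the abstract, so here both branches are fed the raw signal purely to show the structure. The authoritative implementation is in the linked repository.

```python
# Illustrative sketch of a dual-branch EEG classifier (not the official EEG-DBNet code).
import torch
import torch.nn as nn


class ConvBranch(nn.Module):
    """Local convolution block (three conv layers) followed by a global stage."""

    def __init__(self, in_ch, mid_ch, kernel, pool):
        super().__init__()
        # Local block: three 1-D convolutions along the time axis.
        self.local = nn.Sequential(
            nn.Conv1d(in_ch, mid_ch, kernel, padding=kernel // 2),
            nn.BatchNorm1d(mid_ch), nn.ELU(),
            nn.Conv1d(mid_ch, mid_ch, kernel, padding=kernel // 2),
            nn.BatchNorm1d(mid_ch), nn.ELU(),
            nn.Conv1d(mid_ch, mid_ch, kernel, padding=kernel // 2),
            nn.BatchNorm1d(mid_ch), nn.ELU(),
            pool,  # average pooling (temporal branch) or max pooling (spectral branch)
        )
        # Global stage: summarizes the whole sequence into one feature vector.
        self.global_block = nn.Sequential(
            nn.Conv1d(mid_ch, mid_ch, kernel_size=1),
            nn.ELU(),
            nn.AdaptiveAvgPool1d(1),
        )

    def forward(self, x):
        return self.global_block(self.local(x)).flatten(1)


class DualBranchNet(nn.Module):
    def __init__(self, n_channels=22, n_classes=4):  # assumed: BCI 4-2a has 22 electrodes, 4 classes
        super().__init__()
        # Temporal branch: smaller kernels, fewer channels, average pooling.
        self.temporal = ConvBranch(n_channels, 16, kernel=7, pool=nn.AvgPool1d(4))
        # Spectral branch: larger kernels, more channels, max pooling.
        self.spectral = ConvBranch(n_channels, 32, kernel=31, pool=nn.MaxPool1d(4))
        # Concatenated branch outputs feed a fully connected classifier.
        self.classifier = nn.Linear(16 + 32, n_classes)

    def forward(self, x):  # x: (batch, EEG channels, time samples)
        feats = torch.cat([self.temporal(x), self.spectral(x)], dim=1)
        return self.classifier(feats)


if __name__ == "__main__":
    model = DualBranchNet()
    logits = model(torch.randn(8, 22, 1000))  # 8 trials, 22 electrodes, 1000 samples
    print(logits.shape)  # torch.Size([8, 4])
```

The key design point the abstract emphasizes is that the two branches use different convolutional parameters (kernel size, channel width, pooling type), so each specializes in one view of the signal, and the global stage compensates for the limited receptive field of the local convolutions.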
