arXiv:1810.02842
Cross-Subject Transfer Learning on High-Speed Steady-State Visual Evoked Potential-Based Brain-Computer Interfaces

5 October 2018
Kuan-Jung Chiang
Chun-Shu Wei
M. Nakanishi
T. Jung
Abstract

Steady-state visual evoked potential (SSVEP)-based brain-computer interfaces (BCIs) have shown robustness in achieving high information transfer rates. State-of-the-art training-based SSVEP decoding methods such as extended Canonical Correlation Analysis (CCA) and Task-Related Component Analysis (TRCA) elevate the efficiency of SSVEP-based BCIs through an individualized calibration process. However, collecting sufficient calibration data (e.g., training templates) can be laborious and time-consuming, hindering their practicality in a real-world context. This study aims to develop a cross-subject transfer approach to reduce the need for training data from a new user. Study results showed that a new least-squares transformation (LST) method was able to significantly reduce the number of training templates required for a 40-class TRCA-based SSVEP BCI. The LST method may lead to numerous practical applications of plug-and-play high-speed SSVEP-based BCIs.
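As a rough illustration of the least-squares transformation idea mentioned in the abstract, the sketch below fits a channel-by-channel projection that maps an existing subject's multi-channel SSVEP trial onto a new user's template via ordinary least squares. This is only a minimal sketch under assumed array shapes; the paper's exact formulation (e.g., how templates are averaged and how transformed trials feed into TRCA) may differ, and the function name `lst_transform` and the synthetic data are hypothetical.

```python
import numpy as np

def lst_transform(source_trial, target_template):
    """Fit a least-squares spatial transformation P (n_channels x n_channels)
    such that P @ source_trial approximates target_template, and return the
    transformed source trial.

    source_trial    : ndarray, shape (n_channels, n_samples), existing subject's trial
    target_template : ndarray, shape (n_channels, n_samples), new user's template
    """
    # Solve min_P || target_template - P @ source_trial ||_F.
    # np.linalg.lstsq solves A x = b, so transpose to put P.T in the unknown position.
    P_T, *_ = np.linalg.lstsq(source_trial.T, target_template.T, rcond=None)
    return P_T.T @ source_trial

# Toy usage: 9 occipital channels, 1 s of data at 250 Hz (synthetic random signals)
rng = np.random.default_rng(0)
src = rng.standard_normal((9, 250))   # existing subject's trial for one stimulus
tgt = rng.standard_normal((9, 250))   # new user's (few-trial) template for the same stimulus
transferred = lst_transform(src, tgt)
print(transferred.shape)  # (9, 250) - can be pooled as an additional training template
```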
