
MULTI-LEVEL SPATIAL-TEMPORAL ADAPTATION NETWORK FOR MOTOR IMAGERY CLASSIFICATION

Wei Xu, Jing Wang, Ziyu Jia, Zhiqing Hong, Yunze Li, Youfang Lin

Length: 00:05:42
10 May 2022

Electroencephalogram (EEG) signals recorded during motor imagery (MI) are easily influenced by the environment and the subject's state, and therefore exhibit temporal and spatial variability. This variability is even more pronounced across subjects and sessions, which limits cross-domain MI tasks. To address this problem, we propose a Multi-level Spatial-Temporal Adaptation Network (MSTAN) that extracts domain-invariant, multi-level spatial-temporal features to overcome domain differences. First, stacked spatial-temporal graph convolution (STGCN) layers and an attention-based readout module are designed to extract spatial-temporal patterns of EEG signals at multiple levels. An adaptation scheme is then introduced to narrow domain differences: 1) individual graph parameters for the source and target domains are learned at each STGCN layer to capture domain-specific dynamic relationships among brain regions; 2) differences between the spatial-temporal features of the source and target domains are reduced by minimizing their distribution distance. Experiments on a public dataset show that the proposed method achieves state-of-the-art performance in cross-domain motor imagery classification.
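To make the described pipeline concrete, the following is a minimal PyTorch sketch of the ideas in the abstract: stacked spatial-temporal layers with separate learnable graphs for the source and target domains, an attention-based readout that pools multi-level features, and a distribution-distance penalty between domains. The class names (MSTANSketch, STGCNLayer, AttentionReadout), the layer sizes (22 channels, 256 time samples, 3 levels), and the use of Gaussian-kernel MMD as the distribution distance are assumptions for illustration, not the authors' exact implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class STGCNLayer(nn.Module):
    """One spatial-temporal block with a learnable, domain-specific adjacency."""

    def __init__(self, n_channels: int):
        super().__init__()
        # Separate graph parameters per domain (point 1 of the adaptation scheme).
        self.adj = nn.ParameterDict({
            "source": nn.Parameter(torch.eye(n_channels)),
            "target": nn.Parameter(torch.eye(n_channels)),
        })
        # Temporal convolution shared across domains (assumption).
        self.temporal = nn.Conv1d(n_channels, n_channels, kernel_size=5, padding=2)

    def forward(self, x: torch.Tensor, domain: str) -> torch.Tensor:
        # x: (batch, n_channels, time)
        a = torch.softmax(self.adj[domain], dim=-1)    # row-normalized graph
        x = torch.einsum("ij,bjt->bit", a, x)          # spatial graph convolution
        return F.relu(self.temporal(x))                # temporal convolution


class AttentionReadout(nn.Module):
    """Attention-weighted pooling over EEG channels at one level."""

    def __init__(self, time_len: int, out_dim: int):
        super().__init__()
        self.score = nn.Linear(time_len, 1)
        self.proj = nn.Linear(time_len, out_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_channels, time) -> (batch, out_dim)
        w = torch.softmax(self.score(x), dim=1)        # attention over channels
        pooled = (w * x).sum(dim=1)                    # (batch, time)
        return self.proj(pooled)


def mmd(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    """Gaussian-kernel MMD between two feature batches (one possible choice
    of distribution distance; the paper may use a different measure)."""
    def k(x, y):
        d = torch.cdist(x, y) ** 2
        return torch.exp(-d / (2.0 * x.size(1)))
    return k(a, a).mean() + k(b, b).mean() - 2.0 * k(a, b).mean()


class MSTANSketch(nn.Module):
    """Stacked STGCN layers, per-level attention readout, and a classifier."""

    def __init__(self, n_channels=22, time_len=256, n_levels=3, feat_dim=64, n_classes=4):
        super().__init__()
        self.layers = nn.ModuleList([STGCNLayer(n_channels) for _ in range(n_levels)])
        self.readouts = nn.ModuleList([AttentionReadout(time_len, feat_dim) for _ in range(n_levels)])
        self.classifier = nn.Linear(n_levels * feat_dim, n_classes)

    def forward(self, x: torch.Tensor, domain: str):
        feats = []
        for layer, readout in zip(self.layers, self.readouts):
            x = layer(x, domain)
            feats.append(readout(x))                   # collect multi-level features
        feats = torch.cat(feats, dim=-1)
        return self.classifier(feats), feats


# Training step (sketch): classification loss on labelled source trials plus
# an MMD penalty aligning source and target multi-level features.
model = MSTANSketch()
xs, ys = torch.randn(8, 22, 256), torch.randint(0, 4, (8,))
xt = torch.randn(8, 22, 256)                           # unlabelled target trials
logits_s, fs = model(xs, "source")
_, ft = model(xt, "target")
loss = F.cross_entropy(logits_s, ys) + 0.5 * mmd(fs, ft)
loss.backward()
```

In this sketch only the per-layer adjacency matrices differ between domains, while the convolutional and readout weights are shared, so the aligned multi-level features feed a single classifier; the trade-off between the classification loss and the distribution-distance penalty is controlled by the (assumed) 0.5 weight.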
