EVENT-BASED MULTIMODAL SPIKING NEURAL NETWORK WITH ATTENTION MECHANISM

Qianhui Liu, Dong Xing, Lang Feng, Huajin Tang, Gang Pan

  • SPS
    Members: Free
    IEEE Members: $11.00
    Non-members: $15.00
    Length: 00:10:19
11 May 2022

The human brain can effectively integrate visual and auditory information. The Dynamic Vision Sensor (DVS) and the Dynamic Audio Sensor (DAS) are event-based sensors that imitate the mechanisms of the human retina and cochlea. Since these sensors record visual and auditory input as asynchronous discrete events, they are inherently well suited to work with spiking neural networks (SNNs). Existing SNNs for event processing focus mainly on a single modality; audiovisual multimodal SNNs remain limited. In this paper, we propose an end-to-end event-based multimodal spiking neural network. The network consists of visual and auditory unimodal subnetworks and a novel attention-based cross-modal subnetwork for fusion. The attention mechanism measures the significance of each modality and allocates weights to the two modalities accordingly. We evaluate the proposed multimodal network on an event-based audiovisual joint dataset (built from the MNIST-DVS and N-TIDIGITS datasets). Experimental results show the performance improvement of this multimodal network and the effectiveness of the proposed attention mechanism.
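The abstract gives no implementation details, but the fusion step it describes (score each modality, normalize the scores into weights, and combine the modality features) can be sketched roughly as follows. All names here are hypothetical, and the mean-activation scoring is only a stand-in for whatever learned scoring the actual network uses on its spiking features:

```python
import numpy as np

def attention_fusion(v_feat, a_feat):
    """Sketch of attention-based cross-modal fusion (assumed form, not the
    paper's actual code).

    Each modality's feature vector gets a scalar significance score; a
    softmax over the two scores yields per-modality attention weights, and
    the fused representation is the weighted sum of the two feature vectors.
    """
    # Hypothetical scoring: mean activation as a proxy for a learned scorer.
    scores = np.array([v_feat.mean(), a_feat.mean()])
    # Softmax over the two modality scores -> attention weights summing to 1.
    exp = np.exp(scores - scores.max())
    weights = exp / exp.sum()
    # Weighted combination of visual and auditory features.
    fused = weights[0] * v_feat + weights[1] * a_feat
    return fused, weights

# Example: a strongly active visual feature receives the larger weight.
fused, w = attention_fusion(np.ones(4), np.zeros(4))
```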
