Decoding Neural Representations of Rhythmic Sounds from Magnetoencephalography
Pei-Chun Chang, Jia-Ren Chang, Po-Yu Chen, Li-Kai Cheng, Jen-Chuen Hsieh, Hsin-Yen Yu, Li-Fen Chen, Yong-Sheng Chen
Neuroscience studies have revealed neural processes involved in rhythm perception, suggesting that the brain encodes rhythmic sounds and embeds this information in neural activity. In this work, we investigate how to extract the rhythmic information embedded in brain responses and decode the original audio waveforms from the extracted information. A spatiotemporal convolutional neural network is adopted to extract compact rhythm-related representations from noninvasively measured magnetoencephalographic (MEG) signals evoked by listening to rhythmic sounds. These learned MEG representations are then used to condition an audio generator network that synthesizes the original rhythmic sounds. In the experiments, we evaluated the proposed method using MEG signals recorded from eight participants and demonstrated that the generated rhythms are highly related to those that evoked the MEG signals. Interestingly, we found that the auditory-related MEG channels carry high importance in encoding rhythmic representations, that the distribution of these representations relates to the timing of beats, and that behavioral performance is consistent with neural decoding performance. These results suggest that the proposed method can synthesize rhythms by decoding neural representations from MEG.
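The abstract describes a two-stage pipeline: a spatiotemporal convolutional encoder that compresses multichannel MEG epochs into compact rhythm-related representations, and an audio generator conditioned on those representations. The paper does not specify architectural details here, so the following is a minimal sketch only, assuming a PyTorch implementation; the sensor count, layer sizes, latent dimension, and waveform length are hypothetical placeholders, not the authors' actual design.

```python
# Hypothetical sketch of the encoder + conditioned generator pipeline.
# All hyperparameters below are illustrative assumptions.
import torch
import torch.nn as nn


class MEGEncoder(nn.Module):
    """Spatiotemporal convolutional encoder: maps an MEG epoch
    (batch, sensors, time) to a compact rhythm-related representation."""

    def __init__(self, n_meg_channels=204, latent_dim=64):
        super().__init__()
        # 1x1 convolution mixes sensors (spatial filtering).
        self.spatial = nn.Conv1d(n_meg_channels, 32, kernel_size=1)
        # Strided temporal convolutions summarize dynamics over time.
        self.temporal = nn.Sequential(
            nn.Conv1d(32, 64, kernel_size=9, stride=2, padding=4),
            nn.ReLU(),
            nn.Conv1d(64, 64, kernel_size=9, stride=2, padding=4),
            nn.ReLU(),
        )
        self.pool = nn.AdaptiveAvgPool1d(1)
        self.fc = nn.Linear(64, latent_dim)

    def forward(self, meg):                        # meg: (B, sensors, T)
        h = self.temporal(self.spatial(meg))       # (B, 64, T')
        return self.fc(self.pool(h).squeeze(-1))   # (B, latent_dim)


class ConditionedAudioGenerator(nn.Module):
    """Generator that maps the latent MEG representation to a
    fixed-length audio waveform."""

    def __init__(self, latent_dim=64, audio_len=16000):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(),
            nn.Linear(256, 1024), nn.ReLU(),
            nn.Linear(1024, audio_len), nn.Tanh(),  # waveform in [-1, 1]
        )

    def forward(self, z):                           # z: (B, latent_dim)
        return self.net(z)                          # (B, audio_len)


if __name__ == "__main__":
    meg = torch.randn(8, 204, 1000)                 # 8 epochs, 204 sensors, 1000 samples
    encoder, generator = MEGEncoder(), ConditionedAudioGenerator()
    waveform = generator(encoder(meg))
    print(waveform.shape)                           # torch.Size([8, 16000])
```

In such a setup the encoder and generator would typically be trained jointly against the original rhythmic stimulus (e.g., with a waveform or spectral reconstruction loss), so that the latent representation is forced to retain beat-timing information; the actual training objective used by the authors is not stated in this abstract.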