  • SPS
    Members: Free
    IEEE Members: $11.00
    Non-members: $15.00
08 Jun 2023

While the performance of offline neural speech separation systems has been greatly advanced by recent developments in neural network architectures, there is typically an inevitable performance gap between these systems and their online variants. We investigate how RNN-based offline neural speech separation systems can be converted into online counterparts while mitigating the performance degradation. We decompose or reorganize the forward and backward RNN layers in a bidirectional RNN layer to form an online path and an offline path, enabling the model to perform both online and offline processing with a single set of model parameters. We further introduce two training strategies: initialization from a pretrained offline model and a multitask training objective. Experimental results show that the proposed layer decomposition and reorganization schemes, together with the training strategies, effectively mitigate the performance gap between two RNN-based offline separation models and their online variants.
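The core idea of the abstract can be illustrated with a small sketch: treat the forward half of a bidirectional LSTM as a causal online path, and the full forward-plus-backward pair as the offline path, so both modes share one set of parameters. This is a minimal hypothetical illustration, not the paper's exact architecture; the class name, the zero-padding of the missing backward states in online mode, and the shared output projection are all assumptions made for the example.

```python
import torch
import torch.nn as nn

class DecomposedBiLSTM(nn.Module):
    """Sketch: a bidirectional LSTM layer decomposed into an online
    (forward-only, causal) path and an offline (forward + backward)
    path sharing the same parameters. Hypothetical illustration."""

    def __init__(self, input_size, hidden_size):
        super().__init__()
        # Keep the two directions as separate unidirectional LSTMs so
        # the forward half can run on its own in online (causal) mode.
        self.fwd = nn.LSTM(input_size, hidden_size, batch_first=True)
        self.bwd = nn.LSTM(input_size, hidden_size, batch_first=True)
        # Shared projection from 2H back to H, so both paths feed the
        # same downstream layers with identical parameters.
        self.proj = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, x, online=False):
        # x: (batch, time, input_size)
        h_fwd, _ = self.fwd(x)  # causal forward states
        if online:
            # Online: backward states are unavailable, so pad with
            # zeros (one simple choice; other fill strategies exist).
            h = torch.cat([h_fwd, torch.zeros_like(h_fwd)], dim=-1)
        else:
            # Offline: run the backward half on the time-reversed
            # input, then flip its outputs back into causal order.
            h_bwd, _ = self.bwd(x.flip(1))
            h = torch.cat([h_fwd, h_bwd.flip(1)], dim=-1)
        return self.proj(h)
```

Because both modes produce tensors of the same shape through the same projection, the rest of the separation network is unchanged, which is what makes the multitask training objective (jointly optimizing online and offline outputs) straightforward to apply:

```python
layer = DecomposedBiLSTM(8, 16)
x = torch.randn(2, 50, 8)
y_online = layer(x, online=True)    # causal estimate
y_offline = layer(x, online=False)  # full-context estimate
# Both: (2, 50, 16), from the same parameters.
```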
