META LEARNING WITH ADAPTIVE LOSS WEIGHT FOR LOW-RESOURCE SPEECH RECOGNITION

Qiulin Wang (Xiamen University); Wenxuan Hu (Xiamen University); Lin Li (Xiamen University); Qingyang Hong (Xiamen University)

  • SPS
    Members: Free
    IEEE Members: $11.00
    Non-members: $15.00
07 Jun 2023

Model-Agnostic Meta-Learning (MAML) is an effective meta-learning algorithm for low-resource automatic speech recognition (ASR). It learns the model's initialization parameters by gradient descent across various languages, so that the model can quickly adapt to unseen low-resource languages. However, MAML is unstable due to its bilevel loss-backward structure, which significantly affects the stability and generalization of the model. Since different languages contribute differently to the target language, the loss weights corresponding to those languages require costly manual tuning during training, and their proper selection influences the performance of the entire model. In this paper, we propose a loss weight adaptation method for MAML based on Homoscedastic Uncertainty and a Convolutional Neural Network (CNN). Experimental results show that the proposed method outperforms previous gradient-based meta-learning methods and other loss weight adaptation methods, and further improves the stability and effectiveness of MAML.
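As a rough illustration of the weighting idea the abstract describes, the sketch below combines per-language meta-losses with learned homoscedastic-uncertainty weights (in the style of Kendall et al., 2018): each loss is scaled by an inverse-variance term plus a log-variance regularizer, replacing hand-tuned weights. This is a minimal sketch, not the paper's implementation; the function name is hypothetical, the paper's CNN component that produces the weights is omitted, and the `log_sigmas` would be trainable parameters in a real model rather than plain floats.

```python
import math

def weighted_meta_loss(lang_losses, log_sigmas):
    """Combine per-language meta-losses via homoscedastic uncertainty:
    each loss L_i is scaled by 1 / (2 * sigma_i^2) and regularized by
    log(sigma_i), so the relative weights are learned instead of being
    manually tuned.  (Hypothetical helper, not the paper's code.)"""
    total = 0.0
    for loss, log_sigma in zip(lang_losses, log_sigmas):
        precision = math.exp(-2.0 * log_sigma)  # 1 / sigma_i^2
        total += 0.5 * precision * loss + log_sigma
    return total

# With all log-sigmas at 0 (sigma_i = 1), this reduces to half the
# plain sum of the losses: 0.5 * (2 + 4 + 6) = 6.0.
print(weighted_meta_loss([2.0, 4.0, 6.0], [0.0, 0.0, 0.0]))  # -> 6.0
```

When a `log_sigma` grows, its language's loss is down-weighted but the `log_sigma` penalty rises, which prevents the trivial solution of ignoring every language.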
