  • SPS
    Members: Free
    IEEE Members: $11.00
    Non-members: $15.00
    Length: 0:15:06
19 Jan 2021

This paper investigates node-pruning-based compression for non-uniform deep learning models such as acoustic models in automatic speech recognition (ASR). Node pruning for small-footprint ASR has been well studied, but most studies assumed a sigmoid activation function and uniform or simple fully connected neural networks without bypass connections. We propose a node pruning method that can be applied to non-sigmoid functions such as ReLU and that can handle network-topology issues such as bypass connections. To deal with non-sigmoid functions, we extend a node entropy technique to estimate node activities. To cope with non-uniform network topology, we propose three criteria: inter-layer pairing, no bypass connection pruning, and layer-based pruning rate configuration. The proposed method, combining these four techniques and criteria, was applied to compress a Kaldi acoustic model with ReLU as the non-linear function, time delay neural networks (TDNN), and bypass connections inspired by residual networks. Experimental results showed that the proposed method achieved a 31% speed increase while maintaining comparable ASR accuracy by taking network topology into consideration.
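The abstract does not give the formula for the node entropy extension, but the general idea behind entropy-based node pruning can be sketched as follows: treat each ReLU node as "firing" when its activation exceeds a threshold, compute the binary entropy of its firing probability over a sample set, and mark the lowest-entropy nodes (those that are almost always on or almost always off, and thus carry little information) as pruning candidates. The threshold, the binary firing model, and the function names below are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

def node_entropy(activations, threshold=0.0):
    # activations: (n_samples, n_nodes) ReLU outputs for one layer.
    # Firing probability per node: fraction of samples with output above the
    # threshold (0.0 is an assumed choice for ReLU, not from the paper).
    p = np.mean(activations > threshold, axis=0)
    p = np.clip(p, 1e-12, 1 - 1e-12)  # avoid log(0) for always-on/off nodes
    # Binary entropy in bits; near 0 for nodes that are almost always on or off.
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

def select_prune_nodes(activations, prune_rate):
    # Rank nodes by entropy and return indices of the lowest-entropy nodes,
    # up to the requested per-layer pruning rate.
    h = node_entropy(activations)
    n_prune = int(prune_rate * activations.shape[1])
    return np.argsort(h)[:n_prune]
```

In a topology-aware setting such as the paper's, this per-layer ranking would additionally be constrained by the proposed criteria, e.g. skipping nodes on bypass connections and pairing pruned nodes across adjacent layers.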
