Gate Trimming: One-Shot Channel Pruning For Efficient Convolutional Neural Networks
Fang Yu, Chuanqi Han, Pengcheng Wang, Xi Huang, Li Cui
Channel pruning is a promising technique for model compression and acceleration because it reduces the space and time complexity of convolutional neural networks (CNNs) while maintaining their performance. Existing methods perform channel pruning through iterative optimization or training with sparsity-induced regularization, both of which undermine its utility due to their inefficiency. In this work, we propose a one-shot global pruning approach called Gate Trimming (GT), which compresses CNNs more efficiently. To achieve this, GT performs the pruning operation only once, avoiding expensive retraining or re-evaluation of channel redundancy. In addition, GT globally estimates the effect of channels across all layers by information gain (IG). Based on the IG of channels, GT accurately prunes the redundant channels with little negative effect on the CNNs. The experimental results show that the proposed GT is superior to the state-of-the-art methods.
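The abstract does not give the exact IG computation, so the following is only a minimal sketch of the one-shot global pruning idea it describes: score every output channel in every convolutional layer, rank all scores with a single global threshold, and zero out the lowest-scoring channels in one pass with no iterative retraining. The `channel_scores` function below uses a simple weight-magnitude proxy as a hypothetical stand-in for the paper's information-gain criterion; the function names and the pruning ratio are assumptions for illustration.

```python
import torch
import torch.nn as nn

def channel_scores(conv: nn.Conv2d) -> torch.Tensor:
    # Hypothetical per-output-channel score; the paper uses information
    # gain (IG) here, whose computation the abstract does not specify.
    return conv.weight.detach().abs().mean(dim=(1, 2, 3))

def one_shot_global_prune(model: nn.Module, prune_ratio: float = 0.3) -> nn.Module:
    convs = [m for m in model.modules() if isinstance(m, nn.Conv2d)]
    all_scores = torch.cat([channel_scores(c) for c in convs])
    # One global threshold across all layers, so channel importance is
    # compared network-wide rather than layer by layer.
    k = max(1, int(prune_ratio * all_scores.numel()))
    threshold = all_scores.kthvalue(k).values
    for conv in convs:
        mask = (channel_scores(conv) > threshold).float()
        # Prune in a single shot by masking low-scoring channels;
        # no retraining or re-evaluation of redundancy follows.
        conv.weight.data *= mask.view(-1, 1, 1, 1)
        if conv.bias is not None:
            conv.bias.data *= mask
    return model

# Usage example on a toy CNN.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
)
one_shot_global_prune(model, prune_ratio=0.3)
```

Masking channels to zero mimics the effect of pruning on the network's function; an actual deployment would additionally remove the masked filters and the corresponding input channels of downstream layers to realize the space and time savings.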
Chairs: Raja Giryes