Tensor Reordering for CNN Compression
Matej Ulicny, Vladimir A. Krylov, Rozenn Dahyot
Length: 00:08:45
We show how parameter redundancy in Convolutional Neural Network (CNN) filters can be effectively reduced by pruning in the spectral domain. Specifically, the representation extracted via the Discrete Cosine Transform (DCT) is more conducive to pruning than the original space. By relying on a combination of weight tensor reshaping and reordering, we achieve high levels of layer compression with only minor accuracy loss. We apply our approach to compress pretrained CNNs and show that minor additional fine-tuning allows our method to recover the original model performance after a significant parameter reduction. We validate our approach on the ResNet-50 and MobileNet-V2 architectures for the ImageNet classification task.
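As a rough illustration of the idea only (not the authors' exact pipeline, whose reshaping and reordering details are given in the paper), the sketch below prunes a convolutional weight tensor in the DCT domain: reshape the kernel, transform with an orthonormal DCT, zero the smallest-magnitude coefficients, and reconstruct. The function name `dct_prune` and the `keep_ratio` parameter are hypothetical.

```python
# Minimal sketch of DCT-domain pruning of a conv weight tensor.
# Assumption: this is an illustrative simplification, not the paper's method.
import numpy as np
from scipy.fft import dct, idct

def dct_prune(weights, keep_ratio=0.25):
    """Prune a conv weight tensor (out_ch, in_ch, kH, kW) in the DCT domain.

    keep_ratio: fraction of DCT coefficients retained (hypothetical parameter).
    """
    out_ch, in_ch, kh, kw = weights.shape
    # Reshape so each filter becomes one row; the paper's tensor reordering
    # step would permute entries here to concentrate energy before the DCT.
    flat = weights.reshape(out_ch, -1)
    # Orthonormal DCT along the flattened filter dimension.
    coeffs = dct(flat, axis=1, norm='ortho')
    # Keep only the largest-magnitude coefficients (global threshold).
    k = max(1, int(keep_ratio * coeffs.size))
    thresh = np.partition(np.abs(coeffs).ravel(), -k)[-k]
    pruned = np.where(np.abs(coeffs) >= thresh, coeffs, 0.0)
    # Inverse DCT returns the compressed weights to the spatial domain.
    return idct(pruned, axis=1, norm='ortho').reshape(weights.shape)

# Example: prune a random ResNet-style 3x3 conv layer to 25% of its coefficients.
w = np.random.randn(64, 64, 3, 3).astype(np.float32)
w_pruned = dct_prune(w, keep_ratio=0.25)
```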
Chairs:
Igor Fedorov