Adversarial Network Pruning By Filter Robustness Estimation

Xinlu Zhuang (Wuhan University); Yunjie Ge (Wuhan University); Baolin Zheng (Alibaba Group); Qian Wang (Wuhan University)

06 Jun 2023

Network pruning has been extensively studied in model compression to reduce the memory, latency, and computation cost of neural networks. However, pruned networks remain vulnerable to adversarial examples, which limits their broader use in safety-critical applications. Previous studies preserve the robustness of pruned networks by combining adversarial training with network pruning, but they overlook maintaining robustness at high sparsity ratios in structured pruning. To address this problem, we propose an effective filter importance criterion, Filter Robustness Estimation (FRE), which evaluates the importance of filters by estimating their contribution to the adversarial training loss. Empirical results show that our FRE-based Robustness-aware Filter Pruning (FRFP) outperforms state-of-the-art methods by 12.19% to 37.01% in empirical robust accuracy on the CIFAR-10 dataset with the VGG16 network at an extreme pruning ratio of 90%.
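The abstract does not give the exact FRE formula, but the idea of scoring filters by their contribution to the adversarial training loss can be illustrated with a first-order (Taylor-style) estimate computed on adversarial examples. The sketch below is an assumption-laden illustration, not the authors' implementation: the helper names (fgsm_attack, filter_scores), the FGSM attack standing in for the adversarial-example generator, and the |weight x gradient| scoring rule are all choices made here for clarity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def fgsm_attack(model, x, y, eps=8 / 255):
    """Generate FGSM adversarial examples (an illustrative stand-in for the
    attack used during adversarial training)."""
    x_adv = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x_adv), y)
    grad, = torch.autograd.grad(loss, x_adv)
    return (x_adv + eps * grad.sign()).clamp(0, 1).detach()

def filter_scores(model, loader, device="cpu", batches=4):
    """Score each conv filter by |weight * d(adv loss)/d(weight)|, summed per
    output channel -- an assumed first-order proxy for a filter's contribution
    to the adversarial training loss."""
    model.eval()
    scores = {n: torch.zeros(m.out_channels) for n, m in model.named_modules()
              if isinstance(m, nn.Conv2d)}
    for i, (x, y) in enumerate(loader):
        if i >= batches:
            break
        x, y = x.to(device), y.to(device)
        x_adv = fgsm_attack(model, x, y)

        model.zero_grad()
        F.cross_entropy(model(x_adv), y).backward()

        for n, m in model.named_modules():
            if isinstance(m, nn.Conv2d):
                contrib = (m.weight * m.weight.grad).abs()      # first-order term
                scores[n] += contrib.sum(dim=(1, 2, 3)).cpu()   # aggregate per filter
    return scores  # lower score -> smaller estimated contribution to the adversarial loss
```

Under this reading, structured pruning would remove the lowest-scoring filters in each layer and then fine-tune with adversarial training; how FRFP actually ranks and removes filters is specified in the paper itself.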
