WITCHcraft: Efficient PGD Attacks with Random Step Size
Ping-Yeh Chiang, Micah Goldblum, Renkun Ni, Steven Reich, Jonas Geiping, Tom Goldstein, Ali Shafahi
State-of-the-art adversarial attacks on neural networks use expensive iterative methods and numerous random restarts from different initial points. Iterative FGSM-based methods without restarts trade performance for computational efficiency because they do not adequately explore the image space and are highly sensitive to the choice of step size. We propose a variant of Projected Gradient Descent (PGD) that uses a random step size to improve performance without resorting to expensive random restarts. Our method, Wide Iterative Stochastic crafting (WITCHcraft), achieves results superior to the classical PGD attack on the CIFAR-10 and MNIST data sets at no additional computational cost. This simple modification of PGD makes crafting attacks more economical, which is important in situations like adversarial training, where attacks must be crafted in real time.
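To make the idea concrete, the sketch below shows a standard L-infinity PGD loop in which the fixed step size is replaced by a step size drawn at random each iteration, as the abstract describes. This is a minimal illustration written against PyTorch, not the authors' implementation: the function name, the uniform sampling range, and all hyperparameter values are assumptions chosen for readability.

```python
import torch

def witchcraft_attack(model, loss_fn, x, y, epsilon=8/255,
                      num_steps=20, max_step=4/255):
    """Sketch of an L-infinity PGD attack with a random step size.

    Each iteration samples its own step size (here: uniformly on
    [0, max_step], an assumed distribution) instead of using one
    fixed step size for all iterations.
    """
    x_adv = x.clone().detach()
    for _ in range(num_steps):
        x_adv.requires_grad_(True)
        loss = loss_fn(model(x_adv), y)
        grad, = torch.autograd.grad(loss, x_adv)
        # Draw a fresh step size for this iteration.
        alpha = torch.empty(1, device=x.device).uniform_(0, max_step).item()
        with torch.no_grad():
            # Signed-gradient ascent step, as in iterative FGSM / PGD.
            x_adv = x_adv + alpha * grad.sign()
            # Project back into the epsilon-ball around x,
            # then into the valid pixel range [0, 1].
            x_adv = torch.max(torch.min(x_adv, x + epsilon), x - epsilon)
            x_adv = x_adv.clamp(0.0, 1.0)
        x_adv = x_adv.detach()
    return x_adv
```

Because the randomness enters through the step size rather than through restarts from new initial points, the loop runs the same number of gradient computations as plain PGD, which is the source of the "no additional computational cost" claim in the abstract.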