STUDY OF PRE-PROCESSING DEFENSES AGAINST ADVERSARIAL ATTACKS ON STATE-OF-THE-ART SPEAKER RECOGNITION SYSTEMS

Sonal Joshi, Jesús Villalba, Piotr Żelasko, Laureano Moro-Velázquez, Najim Dehak

  • SPS
    Members: Free
    IEEE Members: $11.00
    Non-members: $15.00
    Length: 00:06:15
09 May 2022

Adversarial examples are designed to fool speaker recognition (SR) systems by adding carefully crafted, human-imperceptible noise to speech signals. Since such attacks pose a severe security threat to state-of-the-art SR systems, it is vital to study their vulnerabilities and propose countermeasures that can protect the systems against them. Addressing these concerns, we investigated how state-of-the-art x-vector-based SR systems are affected by white-box adversarial attacks such as the fast gradient sign method (FGSM), iterative FGSM (BIM), projected gradient descent (PGD), and the Carlini-Wagner (CW) attack. To mitigate these attacks, we investigated four pre-processing defenses that do not require adversarial examples during training: randomized smoothing, DefenseGAN, variational autoencoder (VAE), and Parallel WaveGAN vocoder (PWG). These are compared against the baseline defense of adversarial training. Using powerful adaptive white-box adversarial attacks (i.e., where the adversary has full knowledge of the system, including the defense), we found that SR systems are extremely vulnerable to BIM, PGD, and CW attacks. Among the proposed pre-processing defenses, PWG combined with randomized smoothing offers the most protection, with accuracy averaging 93% compared to 52% for the undefended system, and an absolute improvement of more than 90% for BIM attacks with L∞ > 0.001 and the CW attack.
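To make the attack and defense families named above concrete, here is a minimal sketch of one-step FGSM and a randomized-smoothing prediction, using a hypothetical toy logistic-regression "verifier" in place of a real x-vector SR system (the model, variable names, and clipping range are illustrative assumptions, not the paper's setup):

```python
import numpy as np

def fgsm_attack(x, w, b, y, eps):
    """One-step FGSM for a toy logistic model (illustrative stand-in for an SR system):
    x_adv = x + eps * sign(dL/dx), where L is the binary cross-entropy loss."""
    z = float(w @ x + b)
    p = 1.0 / (1.0 + np.exp(-z))        # sigmoid output of the toy model
    grad_x = (p - y) * w                # analytic dL/dx for logistic loss
    # Clip to a nominal valid signal range (assumed [-1, 1] here)
    return np.clip(x + eps * np.sign(grad_x), -1.0, 1.0)

def smoothed_predict(x, w, b, sigma=0.1, n=100, rng=None):
    """Randomized-smoothing sketch: average the model's output over
    Gaussian-perturbed copies of the input (sigma and n are illustrative)."""
    rng = np.random.default_rng(0) if rng is None else rng
    noisy = x + sigma * rng.standard_normal((n, x.size))
    z = noisy @ w + b
    return float((1.0 / (1.0 + np.exp(-z))).mean())
```

The FGSM step perturbs each input dimension by at most eps, which is why the paper reports results as a function of the L∞ perturbation budget; randomized smoothing counters such small perturbations by averaging predictions over noisy copies of the input.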
