
HE-GAN: Differentially Private GAN using Hamiltonian Monte Carlo based Exponential Mechanism

Usman Hassan (University of Kentucky); Dongjie Chen (University of California, Davis); Sen-ching S Cheung (University of Kentucky); Chen-Nee Chuah (University of California, Davis)

06 Jun 2023

Differentially-private (DP) Generative Adversarial Networks (GANs) can be used to protect the privacy of training data and support public downstream learning tasks with synthetic data. However, typical DP mechanisms add noise to the training process and can lead to various convergence problems. We propose HE-GAN, a DP generative framework that eliminates noise addition by applying the Exponential Mechanism (EM) to the privacy-factor-adjusted posterior predictive distribution of a classifier trained on the private data. EM is more general than many other DP mechanisms, including the Laplacian and Gaussian mechanisms, and its reliance on sampling the output space prevents DP noise from corrupting the training process. Two challenges remain. First, sampling the posterior distribution of the private discriminative classifier may not produce high-quality synthetic samples; to address this, we instead sample from the latent space of a publicly-trained GAN to optimize the private posterior. Second, efficient sampling of this latent space is non-trivial; we address it with the highly effective Hamiltonian Monte Carlo (HMC) method. We perform experiments on MNIST and Fashion-MNIST under public-private splits. Results show that HE-GAN achieves downstream classification accuracy on par with or better than state-of-the-art schemes over a wide range of privacy budgets.
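As a rough illustration of the latent-space sampling step described above, the sketch below runs a basic Hamiltonian Monte Carlo loop over a GAN latent vector z. The target callables log_prob and grad_log_prob are generic placeholders standing in for the privacy-factor-adjusted log-posterior of the private classifier evaluated at the generator output G(z); they, and all parameter choices shown, are assumptions for illustration and not the paper's actual implementation.

    import numpy as np

    def hmc_sample(log_prob, grad_log_prob, z0, step_size=0.05,
                   n_leapfrog=20, n_samples=100, rng=None):
        """Basic HMC over a latent vector z.

        log_prob / grad_log_prob are hypothetical stand-ins for the
        privacy-adjusted log-posterior evaluated at G(z)."""
        rng = np.random.default_rng() if rng is None else rng
        z = np.array(z0, dtype=float)
        samples = []
        for _ in range(n_samples):
            p = rng.standard_normal(z.shape)        # resample momentum
            z_new, p_new = z.copy(), p.copy()
            # leapfrog integration of the Hamiltonian dynamics
            p_new += 0.5 * step_size * grad_log_prob(z_new)
            for _ in range(n_leapfrog - 1):
                z_new += step_size * p_new
                p_new += step_size * grad_log_prob(z_new)
            z_new += step_size * p_new
            p_new += 0.5 * step_size * grad_log_prob(z_new)
            # Metropolis accept/reject on the Hamiltonian
            h_old = -log_prob(z) + 0.5 * p @ p
            h_new = -log_prob(z_new) + 0.5 * p_new @ p_new
            if np.log(rng.uniform()) < h_old - h_new:
                z = z_new
            samples.append(z.copy())
        return np.stack(samples)

    # Toy target: a standard-normal latent prior, used here only so the
    # sketch runs end to end; the real target would involve the trained
    # generator and private classifier.
    log_prob = lambda z: -0.5 * z @ z
    grad_log_prob = lambda z: -z
    latents = hmc_sample(log_prob, grad_log_prob, z0=np.zeros(100))

Accepted latent samples would then be passed through the public generator to produce the synthetic images used for downstream training; that wiring is omitted here.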
