FINE-GRAINED PRIVATE KNOWLEDGE DISTILLATION
Yuntong Li (Guangzhou University); Shaowei Wang (Guangzhou University); Yingying Wang (Guangzhou University); Jin Li (Guangzhou University); Yuqiu Qian (Tencent Inc.); Bangzhou Xin (University of Science and Technology of China); Wei Yang (University of Science and Technology of China)
Knowledge distillation has emerged as a scalable and effective approach to privacy-preserving machine learning. One remaining drawback is that it consumes the privacy budget at the client level. To attain fine-grained privacy accounting and improve utility, this work proposes a model-free \textit{reverse $k$-NN labeling} method for record-level private knowledge distillation, in which each private record is used to label at most $k$ queries. Theoretically, we provide bounds on the labeling error rate under the centralized and local models of differential privacy. Experimentally, we demonstrate that the method achieves new state-of-the-art accuracy on the MNIST, SVHN, and CIFAR-10 datasets with an order of magnitude lower privacy loss.
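A minimal sketch of what such a reverse $k$-NN labeling step could look like under the centralized model of differential privacy, assuming Euclidean distance and a Gaussian-noise vote aggregation; the function name `reverse_knn_label` and the noise scale `sigma` are illustrative, not taken from the paper. The key point it illustrates is that each private record votes only on its $k$ nearest queries, so the record-level sensitivity of every per-query vote histogram is bounded by $k$.

```python
import numpy as np

def reverse_knn_label(private_x, private_y, queries, num_classes, k, sigma, rng=None):
    """Label public queries via reverse k-NN voting (illustrative sketch).

    Each private record casts votes on its k nearest queries -- the reverse
    of standard k-NN, where each query collects its k nearest records -- so
    one record influences at most k vote counts. Gaussian noise with scale
    `sigma` (assumed calibrated offline from the target (epsilon, delta) and
    sensitivity k) is then added to each query's vote histogram.
    Assumes k < len(queries).
    """
    rng = np.random.default_rng() if rng is None else rng
    votes = np.zeros((len(queries), num_classes))
    for x, y in zip(private_x, private_y):
        # Find this record's k nearest queries (reverse k-NN direction).
        dists = np.linalg.norm(queries - x, axis=1)
        nearest = np.argpartition(dists, k)[:k]
        votes[nearest, y] += 1
    # Gaussian mechanism on the per-query vote histograms.
    noisy = votes + rng.normal(scale=sigma, size=votes.shape)
    return noisy.argmax(axis=1)
```

The noisy argmax per query yields the distilled labels handed to the student model; a local-model variant would instead perturb each record's votes before aggregation.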