AAAI Publications, Thirty-Second AAAI Conference on Artificial Intelligence

Uplink Communication Efficient Differentially Private Sparse Optimization With Feature-Wise Distributed Data
Jian Lou, Yiu-ming Cheung

Last modified: 2018-04-25


Preserving differential privacy during empirical risk minimization model training has been extensively studied under centralized and sample-wise distributed dataset settings. This paper considers a largely unexplored setting in which features are partitioned among different parties under privacy restrictions. Motivated by the nearly optimal utility guarantee achieved by the centralized private Frank-Wolfe algorithm (Talwar, Thakurta, and Zhang 2015), we develop a distributed variant with guaranteed privacy, utility, and uplink communication complexity. To obtain these guarantees, we provide a substantially more general convergence analysis for block-coordinate Frank-Wolfe under arbitrary sampling, which extends known convergence results that apply only to two specific block sampling distributions. We also design an active feature sharing scheme based on the private Johnson-Lindenstrauss transform, which is the key to updating local partial gradients in a differentially private and communication-efficient manner.
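To illustrate the kind of method the abstract builds on, the following is a minimal sketch of a private Frank-Wolfe step over the ℓ1 ball, where the linear minimization oracle picks a vertex by report-noisy-max. This is not the paper's distributed algorithm: the function name, the fixed `noise_scale` parameter, and the toy gradient oracle are illustrative assumptions; a real differentially private implementation would calibrate the Laplace scale from the gradient's ℓ∞ sensitivity, the privacy budget, and the number of iterations.

```python
import numpy as np

def private_frank_wolfe(grad_fn, dim, steps, radius=1.0, noise_scale=0.1, seed=0):
    """Sketch: Frank-Wolfe over the l1 ball with noisy vertex selection.

    grad_fn(w) returns the gradient of a smooth loss at w.
    noise_scale is a stand-in; a DP implementation would derive it from
    the gradient sensitivity, the privacy budget, and the step count.
    """
    rng = np.random.default_rng(seed)
    w = np.zeros(dim)
    for t in range(steps):
        g = grad_fn(w)
        # Vertices of the l1 ball are +/- radius * e_i.  The score of a
        # vertex s is -<g, s>, so picking the max score minimizes <g, s>.
        scores = np.concatenate([-radius * g, radius * g])
        scores += rng.laplace(scale=noise_scale, size=2 * dim)  # report-noisy-max
        j = int(np.argmax(scores))
        s = np.zeros(dim)
        s[j % dim] = radius if j < dim else -radius
        gamma = 2.0 / (t + 2.0)  # standard Frank-Wolfe step size
        w = (1.0 - gamma) * w + gamma * s
    return w
```

Because each iterate is a convex combination of ℓ1-ball vertices (and the zero start), the sparsity-inducing constraint is maintained for free, which is why Frank-Wolfe methods pair naturally with the sparse optimization setting the paper targets.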


Differential Privacy, Frank-Wolfe Algorithm, Distributed Sparse Optimization
