Differentially Private Accelerated Optimization Algorithms

Authors
Publication date 2022
Journal SIAM Journal on Optimization
Volume 32, Issue 2
Pages (from-to) 795-821
Organisations
  • Faculty of Economics and Business (FEB) - Amsterdam Business School Research Institute (ABS-RI)
Abstract
We present two classes of differentially private optimization algorithms derived from well-known accelerated first-order methods. The first algorithm is inspired by Polyak’s heavy ball method and employs a smoothing approach to decrease the accumulated noise on the gradient steps required for differential privacy. The second class of algorithms is based on Nesterov’s accelerated gradient method and its recent multi-stage variant. We propose a noise dividing mechanism for the iterations of Nesterov’s method in order to improve the error behavior of the algorithm. Convergence rate analyses are provided for both the heavy ball and Nesterov’s accelerated gradient methods with the help of dynamical systems analysis techniques. Finally, we conclude with numerical experiments showing that the presented algorithms have advantages over well-known differentially private algorithms.
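The abstract's first idea, a heavy-ball iteration driven by noisy gradients, can be sketched in a few lines. This is only an illustrative toy, not the paper's method: the noise scale `sigma`, step size `eta`, and momentum coefficient `beta` below are placeholder values, and the Gaussian perturbation stands in for a properly calibrated differential-privacy mechanism.

```python
import numpy as np

def dp_heavy_ball(grad, x0, eta=0.1, beta=0.9, sigma=0.01, steps=200, seed=0):
    """Heavy-ball descent with Gaussian-perturbed gradients (illustrative only).

    `sigma` mimics the noise a differential-privacy mechanism would add;
    calibrating it to a privacy budget is the subject of the paper and is
    not attempted here.
    """
    rng = np.random.default_rng(seed)
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(steps):
        # Noisy gradient: true gradient plus isotropic Gaussian noise.
        noisy_grad = grad(x) + sigma * rng.standard_normal(x.shape)
        # Polyak heavy-ball update: gradient step plus momentum term.
        x_next = x - eta * noisy_grad + beta * (x - x_prev)
        x_prev, x = x, x_next
    return x

# Toy strongly convex objective f(x) = 0.5 * ||x||^2, so grad(x) = x
# and the minimizer is the origin.
x_final = dp_heavy_ball(lambda x: x, np.array([5.0, -3.0]))
```

Despite the injected noise, the iterate settles near the minimizer on this toy problem; how the accumulated noise degrades the convergence rate, and how smoothing mitigates it, is what the paper analyzes.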
Document type Article
Language English
DOI https://doi.org/10.1137/20M1355847