
Privacy-Preserving Logistic Regression Training with a Faster Gradient Variant

Abstract

Logistic regression training over encrypted data has long been an attractive approach to addressing security concerns. In this paper, we propose a faster gradient variant called quadratic gradient for privacy-preserving logistic regression training. The core of quadratic gradient can be seen as an extension of the simplified fixed Hessian. We enhance Nesterov's accelerated gradient (NAG) and the Adaptive Gradient Algorithm (Adagrad) with quadratic gradient and evaluate the enhanced algorithms on several datasets, including the gene dataset provided by the 2017 iDASH competition. Experiments show that the enhanced methods achieve state-of-the-art convergence speed compared to the raw first-order gradient methods. We then adopt the enhanced NAG method to implement homomorphic logistic regression training, obtaining a comparable result in only 33 iterations. There is a promising chance that quadratic gradient could be used to enhance other first-order gradient methods for general numerical optimization problems.
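The abstract does not spell out the construction, so the sketch below is only an illustration of the general idea under stated assumptions: a diagonal preconditioner is built once from a simplified-fixed-Hessian-style bound (here 1/4 XᵀX for logistic regression), the gradient is scaled componentwise by its inverse to form the quadratic gradient, and that scaled gradient drives a standard NAG-style ascent loop. The function names, the unit step size, the momentum schedule, and the exact diagonal construction are assumptions for illustration, not the authors' precise method.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def nag_quadratic_gradient(X, y, iters=33, eps=1e-8):
    """Sketch of logistic regression training: NAG enhanced with a
    quadratic-gradient-style preconditioner (details assumed, not
    taken from the paper).

    X: (n, d) data matrix with a bias column; y: labels in {0, 1}.
    """
    n, d = X.shape
    # Fixed Hessian bound for the log-likelihood is -(1/4) X^T X;
    # take absolute row sums as a diagonal surrogate (assumed form).
    H = 0.25 * (X.T @ X)
    h = np.abs(H).sum(axis=1)            # diagonal entries |h_ii|
    B = 1.0 / (h + eps)                  # inverse diagonal scaling, computed once

    w = np.zeros(d)                      # model weights
    v = w.copy()                         # NAG lookahead point
    t_prev = 1.0
    for _ in range(iters):
        g = X.T @ (y - sigmoid(X @ v))   # ascent gradient of the log-likelihood
        G = B * g                        # "quadratic gradient": componentwise scaling
        w_next = v + G                   # unit step; the paper may use a schedule
        t_next = (1 + np.sqrt(1 + 4 * t_prev**2)) / 2
        gamma = (t_prev - 1) / t_next    # Nesterov momentum coefficient
        v = w_next + gamma * (w_next - w)
        w, t_prev = w_next, t_next
    return w
```

One reason a construction like this fits the homomorphic setting is that the diagonal scaling is computed once up front and then applied as a fixed componentwise multiplication each iteration, which keeps the per-iteration cost close to that of a plain first-order update; this reading is an inference from the abstract, not a claim from the paper.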
