Privacy-Preserving Logistic Regression Training with A Faster Gradient Variant

Logistic regression training over encrypted data has been an attractive approach to addressing security concerns for years. In this paper, we propose a faster gradient variant for privacy-preserving logistic regression training. The core of this variant can be seen as an extension of the simplified fixed Hessian. We enhance Nesterov's accelerated gradient (NAG) and the Adaptive Gradient Algorithm (Adagrad) with this variant and evaluate the enhanced algorithms on several datasets, including the gene dataset provided by the 2017 iDASH competition. Experiments show that the enhanced methods achieve state-of-the-art convergence speed compared to the raw first-order gradient methods. We then adopt the enhanced NAG method to implement homomorphic logistic regression training, obtaining a comparable result in only a small number of iterations. There is a promising chance that this gradient variant could be used to enhance other first-order gradient methods for general numerical optimization problems.
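As a rough illustration of the idea, and not the paper's exact algorithm, the sketch below combines Nesterov's accelerated gradient for logistic regression with a diagonal curvature bound in the style of the simplified fixed Hessian, used here as a per-coordinate scaling of the ordinary gradient. The function names, the {0,1} label convention, and the specific scaling grad / (B + eps) are assumptions made for this example.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sfh_diagonal(X):
    # Simplified-fixed-Hessian-style diagonal bound: for logistic regression
    # the Hessian is dominated by (1/4) X^T X, and each diagonal entry is
    # replaced by a row sum of absolute values (illustrative choice).
    XtX = X.T @ X
    return 0.25 * np.sum(np.abs(XtX), axis=1)

def nag_logreg_sfh(X, y, iters=10, eps=1e-8):
    """Hypothetical reconstruction: NAG whose gradient is rescaled
    coordinate-wise by a fixed SFH-style diagonal."""
    n, d = X.shape
    w = np.zeros(d)
    v = w.copy()              # look-ahead point for Nesterov momentum
    B = sfh_diagonal(X)       # fixed diagonal curvature bound, computed once
    gamma_prev = 1.0
    for _ in range(iters):
        p = sigmoid(X @ v)
        grad = X.T @ (p - y) / n          # ordinary first-order gradient
        step = grad / (B + eps)           # element-wise scaled gradient
        gamma = (1 + np.sqrt(1 + 4 * gamma_prev**2)) / 2
        w_new = v - step
        v = w_new + ((gamma_prev - 1) / gamma) * (w_new - w)
        w, gamma_prev = w_new, gamma
    return w

# Example usage on synthetic data (not from the paper):
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (sigmoid(X @ np.array([1.0, -2.0, 0.5, 0.0, 1.5])) > 0.5).astype(float)
w_est = nag_logreg_sfh(X, y, iters=10)
```

Because the diagonal bound is computed once from X alone, the per-coordinate scaling adds no data-dependent branching, which is what makes this kind of variant attractive for evaluation under homomorphic encryption.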