Training logistic regression over encrypted data has been an attractive approach to addressing security concerns for several years. In this paper, we introduce an efficient gradient variant, called the \textit{quadratic gradient}, for privacy-preserving logistic regression training. We enhance Nesterov's Accelerated Gradient (NAG), the Adaptive Gradient Algorithm (Adagrad), and Adam by incorporating quadratic gradients, and evaluate the enhanced algorithms on several datasets. Experimental results demonstrate that the enhanced algorithms converge significantly faster than their first-order counterparts. Moreover, we apply the enhanced NAG method to implement homomorphic logistic regression training, obtaining comparable results within just 4 iterations. The quadratic gradient approach has the potential to bridge first-order gradient descent/ascent algorithms and the second-order Newton-Raphson method, and it may apply to a broad range of numerical optimization problems.
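As a rough illustration of how a quadratic gradient can be dropped into NAG, the sketch below builds a diagonal preconditioner $\bar{B}$ from Böhning's fixed bound $-\frac{1}{4}X^{\top}X$ on the logistic Hessian and replaces the plain gradient $g$ with $G = \bar{B}g$. This is a minimal sketch under that assumption; the function name, the $\epsilon$ constant, and the step size $1 + \eta$ are illustrative choices, not the paper's exact parameters.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def nag_quadratic_gradient(X, y, iters=10, eta=0.1, eps=1e-8):
    """NAG ascent on the logistic log-likelihood, with the plain gradient g
    replaced by the quadratic gradient G = B g.

    B is diagonal, built once from a fixed bound on the Hessian magnitude
    (Böhning's bound X^T X / 4):  B_ii = 1 / (eps + sum_j |(X^T X / 4)_ij|).
    The names `eta` and `eps` are illustrative, not the paper's parameters.
    """
    n, d = X.shape
    H_bar = 0.25 * X.T @ X                       # fixed Hessian bound (magnitude)
    B = 1.0 / (eps + np.abs(H_bar).sum(axis=1))  # diagonal preconditioner, computed once

    w = np.zeros(d)   # current iterate
    v = w.copy()      # look-ahead iterate
    lam = 1.0         # Nesterov momentum sequence
    for _ in range(iters):
        g = X.T @ (y - sigmoid(X @ v))           # ascent gradient at look-ahead point
        G = B * g                                # quadratic gradient (elementwise B_ii * g_i)
        lam_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * lam * lam))
        gamma = (1.0 - lam) / lam_next
        w_new = v + (1.0 + eta) * G              # rate > 1, as suggested for quadratic gradients
        v = (1.0 - gamma) * w_new + gamma * w    # Nesterov extrapolation
        w, lam = w_new, lam_next
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))
    true_w = np.array([1.0, -2.0, 0.5])
    y = (rng.random(100) < sigmoid(X @ true_w)).astype(float)
    print("learned weights:", nag_quadratic_gradient(X, y, iters=10))
```

One plausible reason this construction suits homomorphic evaluation is that $\bar{B}$ is fixed across iterations, so it can be computed once and reused, rather than re-derived from an encrypted Hessian at every step.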