Training logistic regression over encrypted data has emerged in recent years as a prominent approach to addressing data-security concerns. In this paper, we introduce an efficient gradient variant, termed the \textit{quadratic gradient}, designed for privacy-preserving logistic regression while remaining equally effective for plaintext optimization. By incorporating the quadratic gradient, we enhance Nesterov's Accelerated Gradient (NAG), Adaptive Gradient (AdaGrad), and Adam. We evaluate the enhanced algorithms on several datasets, and the experimental results demonstrate state-of-the-art convergence rates that significantly outperform traditional first-order gradient methods. Furthermore, we apply the enhanced NAG method to homomorphic logistic regression training, achieving comparable performance in only four iterations. The proposed quadratic-gradient approach offers a unified framework that combines the advantages of first-order gradient methods and second-order Newton-type methods, suggesting broad applicability to diverse numerical optimization tasks.
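To make the construction concrete, one plausible formulation (a sketch under the bounded-Hessian assumption common in privacy-preserving logistic regression; the surrogate $\tilde{H}$, the constant $\epsilon$, and the $\tfrac{1}{4}X^{\top}X$ bound are illustrative choices, not details fixed by the abstract) builds a diagonal preconditioner from a fixed Hessian surrogate and rescales the ordinary gradient $g$:
\[
\bar{B} = \operatorname{diag}\!\left(\frac{1}{\epsilon + \sum_{j} |\tilde{H}_{ij}|}\right), \qquad G = \bar{B}\, g, \qquad \tilde{H} = \tfrac{1}{4}\, X^{\top} X,
\]
with the enhanced methods substituting the quadratic gradient $G$ for $g$ in the NAG, AdaGrad, and Adam update rules. Because $\tilde{H}$ is fixed, $\bar{B}$ need only be computed once, keeping the per-iteration cost close to that of a first-order method, a property that is attractive under homomorphic encryption.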