Conic optimization plays a crucial role in many machine learning (ML) problems. However, practical algorithms for conic-constrained ML problems with large datasets are often limited to specific use cases, as stochastic algorithms for general conic optimization remain underdeveloped. To fill this gap, we introduce a stochastic interior-point method (SIPM) framework for general conic optimization, along with four novel SIPM variants leveraging distinct stochastic gradient estimators. Under mild assumptions, we establish the global convergence rates of our proposed SIPMs, which, up to a logarithmic factor, match the best-known rates in stochastic unconstrained optimization. Finally, our numerical experiments on robust linear regression, multi-task relationship learning, and clustering data streams demonstrate the effectiveness and efficiency of our approach.
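For context, interior-point methods handle a conic constraint through a barrier function, and a stochastic variant replaces the exact gradient with an estimator. The following is a minimal sketch of this general idea, assuming a barrier $B$ for the cone $\mathcal{K}$ and an unbiased stochastic gradient estimator $g_k$; the symbols $B$, $\mathcal{K}$, $g_k$, $\mu_k$, and $\eta_k$ are illustrative notation, not taken from the paper:

$$
\min_{x \in \mathcal{K}} f(x)
\quad\longrightarrow\quad
\min_{x}\; f(x) + \mu_k\, B(x), \qquad \mu_k \downarrow 0,
$$
$$
x_{k+1} = x_k - \eta_k \bigl( g_k + \mu_k \nabla B(x_k) \bigr), \qquad \mathbb{E}[g_k] = \nabla f(x_k).
$$

For instance, for the nonnegative orthant $\mathcal{K} = \mathbb{R}^n_{+}$, a standard choice is the log-barrier $B(x) = -\sum_{i=1}^{n} \log x_i$, which keeps iterates strictly feasible as the barrier parameter $\mu_k$ is driven to zero.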