Reliable decision-making with streaming data requires principled uncertainty quantification of online methods. While first-order methods enable efficient iterate updates, their inference procedures still require updating appropriate (covariance) matrices, incurring $O(d^2)$ time and memory complexity, and are sensitive to the problem's ill-conditioning and noise heterogeneity. This costly inference task offers an opportunity for more robust second-order methods, which are, however, bottlenecked by solving Newton systems at $O(d^3)$ complexity. In this paper, we address this gap by studying an online Newton method with Hessian averaging, where the Newton direction at each step is approximately computed by a sketch-and-project solver with Nesterov's acceleration, matching the $O(d^2)$ complexity of first-order methods. For the proposed method, we quantify the uncertainty arising from both the random data and the randomized computation. Under standard smoothness and moment conditions, we establish global almost-sure convergence, prove asymptotic normality of the last iterate with a limiting covariance characterized by a Lyapunov equation, and develop a fully online covariance estimator with non-asymptotic convergence guarantees. We also connect the resulting uncertainty quantification to that of exact and sketched Newton methods without Nesterov's acceleration. Extensive experiments on regression models demonstrate the superiority of the proposed method for online inference.
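To make the sketch-and-project idea concrete, the following is a minimal sketch of an *unaccelerated* solver for a Newton system $Hx = g$. It is a generic illustration of the technique, not the paper's accelerated method; the Gaussian sketch distribution, sketch size, and iteration budget are illustrative assumptions.

```python
import numpy as np

def sketch_and_project_solve(H, g, sketch_size=4, n_iters=500, seed=None):
    """Approximately solve the Newton system H x = g by sketch-and-project.

    Each step draws a random sketch matrix S and projects the current
    iterate, in the H-norm, onto the solution set of the sketched system
    S^T H x = S^T g, i.e.  x <- x - S (S^T H S)^{-1} S^T (H x - g).
    Assumes H is symmetric positive definite.
    """
    rng = np.random.default_rng(seed)
    d = H.shape[0]
    x = np.zeros(d)
    for _ in range(n_iters):
        S = rng.standard_normal((d, sketch_size))   # Gaussian sketch, d x s
        r = S.T @ (H @ x - g)                       # sketched residual, size s
        M = S.T @ H @ S                             # s x s sketched Hessian
        x -= S @ np.linalg.solve(M, r)              # H-norm projection step
    return x
```

Each iteration only factors an $s \times s$ sketched system rather than the full $d \times d$ Hessian; Nesterov's acceleration, as studied in the paper, adds momentum across these projection steps to speed up convergence.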