This paper addresses second-order stochastic optimization for estimating the minimizer of a convex function written as an expectation. A direct recursive estimation technique for the inverse Hessian matrix, based on a Robbins-Monro procedure, is introduced. This approach drastically reduces the computational complexity. Above all, it makes it possible to develop universal stochastic Newton methods and to investigate the asymptotic efficiency of the proposed approach. This work thus expands the application scope of second-order algorithms in stochastic optimization.
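To make the idea concrete, here is a minimal sketch of how such a Robbins-Monro recursion for the inverse Hessian might look; the notation ($\gamma_n$, $\nu_n$, $\Phi_n$, $A_n$) is an illustrative assumption, not taken from the paper. Suppose one can draw random vectors $\Phi_{n+1}$ with $\mathbb{E}[\Phi_{n+1}\Phi_{n+1}^{\top}] = H := \nabla^2 F(\theta^\ast)$, and let $(\gamma_n)$ be a step-size sequence. A direct recursion for $A_n \approx H^{-1}$ is
\[
A_{n+1} = A_n + \gamma_{n+1}\bigl(I_d - \Phi_{n+1}\Phi_{n+1}^{\top} A_n\bigr),
\]
whose mean field $I_d - HA$ vanishes exactly at $A = H^{-1}$, so the inverse Hessian is estimated without ever inverting a matrix: each step costs $O(d^2)$ (one rank-one correction) rather than the $O(d^3)$ of inverting a Hessian estimate. Under these assumptions, the estimate could then drive a stochastic Newton step of the form $\theta_{n+1} = \theta_n - \nu_{n+1}\, A_{n+1}\, \nabla_\theta f(\theta_n, X_{n+1})$.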