We propose a new stochastic optimizer for non-convex and possibly non-smooth objective functions, typically defined over high-dimensional design spaces. To this end, we seek to bridge noise-assisted global search with faster local convergence, the latter being the characteristic feature of a Newton-like search. Our specific scheme -- acronymed FINDER (Filtering Informed Newton-like and Derivative-free Evolutionary Recursion) -- exploits the nonlinear stochastic filtering equations to arrive at a derivative-free update that resembles a Newton search employing the inverse Hessian of the objective function. After simplifying the update to achieve linear scaling with dimension, along with a few other enhancements, we apply FINDER to a range of problems: from IEEE benchmark objective functions, to a couple of archetypal data-driven problems in deep networks, to certain cases of physics-informed deep networks. The performance of the new method vis-\`a-vis the well-known Adam optimizer and a few others bears evidence to its promise for high-dimensional optimization problems of practical interest.