We consider the problem of minimising submodular functions and investigate the application of a zeroth-order method to it. The method exploits a Gaussian smoothing random oracle to estimate the gradient of the smoothed function. We prove that the algorithm converges to a global $\epsilon$-approximate solution in the offline case and show that it is Hannan-consistent with respect to static regret in the online case. Moreover, we show that the algorithm achieves $O(\sqrt{NP_N^\ast})$ dynamic regret, where $N$ is the number of iterations and $P_N^\ast$ is the path length. The complexity analysis and hyperparameter selection are presented for all cases, and the theoretical results are illustrated via numerical examples.
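The Gaussian smoothing oracle mentioned above can be sketched as follows. This is a minimal illustration of the standard two-point Gaussian-smoothing gradient estimator (in the style of Nesterov–Spokoiny), not the paper's exact algorithm; the function name, parameters, and sample-averaging choice are assumptions for the sketch.

```python
import numpy as np

def smoothed_grad_estimate(f, x, mu=1e-2, n_samples=20, rng=None):
    """Two-point Gaussian-smoothing gradient estimator for f at x.

    Averages (f(x + mu*u) - f(x)) / mu * u over Gaussian directions u.
    Each term is an unbiased estimate of the gradient of the smoothed
    function f_mu(x) = E_u[f(x + mu*u)], which approximates the gradient
    of f as the smoothing radius mu shrinks.
    """
    rng = np.random.default_rng(rng)
    d = x.shape[0]
    g = np.zeros(d)
    for _ in range(n_samples):
        u = rng.standard_normal(d)          # random Gaussian direction
        g += (f(x + mu * u) - f(x)) / mu * u  # finite-difference estimate
    return g / n_samples
```

A zeroth-order scheme then simply takes (projected) descent steps along this estimate; only function evaluations of $f$ are required, never its gradient.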