Finite difference (FD) approximation is a classic approach to stochastic gradient estimation when only noisy function realizations are available. In this paper, we first provide a sample-driven method, based on the bootstrap technique, for estimating the optimal perturbation, and then propose an efficient FD estimator that uses correlated samples at the estimated optimal perturbation. Theoretical analyses of both the perturbation estimator and the FD estimator reveal that, {\it surprisingly}, the correlation enables the proposed FD estimator to achieve a reduction in variance and, in some cases, a decrease in bias compared with the traditional optimal FD estimator. Numerical results confirm the efficiency of our estimators and align well with the theory, especially for small sample sizes. Finally, we apply the estimator to derivative-free optimization (DFO) problems; numerical studies show that DFO problems with dimensions as high as 100 can be solved effectively.
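To illustrate the core idea of using correlated samples in finite differences, the sketch below compares a central FD gradient estimator under common random numbers (the same noise realization reused at both perturbed points, so the noise cancels) against independent noise. The quadratic test function, additive Gaussian noise model, and all parameters are hypothetical illustrations, not the paper's actual setup or method.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_f(x, noise):
    # Hypothetical noisy oracle: a quadratic plus additive noise.
    return x**2 + noise

def fd_gradient(x, h, correlated=True, n=1000):
    # Central finite-difference gradient estimate, averaged over n replications.
    # With correlated=True, the same noise realization is reused at x+h and x-h
    # (common random numbers), so the additive noise cancels in the difference.
    if correlated:
        noise = rng.normal(0.0, 1.0, n)
        plus, minus = noisy_f(x + h, noise), noisy_f(x - h, noise)
    else:
        plus = noisy_f(x + h, rng.normal(0.0, 1.0, n))
        minus = noisy_f(x - h, rng.normal(0.0, 1.0, n))
    return np.mean((plus - minus) / (2.0 * h))
```

For additive noise, the correlated version recovers the exact FD quotient (here, the true gradient 2x), while the independent version carries noise variance of order $\sigma^2/(2nh^2)$, which is large when the perturbation $h$ is small.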