Stochastic-gradient MCMC methods enable scalable Bayesian posterior sampling but often suffer from sensitivity to minibatch size and gradient noise. To address this, we propose the Stochastic Gradient Lattice Random Walk (SGLRW), an extension of the Lattice Random Walk discretization. Unlike conventional Stochastic Gradient Langevin Dynamics (SGLD), SGLRW introduces stochastic noise only through the off-diagonal elements of the update covariance; this yields greater robustness to minibatch size while retaining asymptotic correctness. As a point of comparison, we also analyze a natural analogue of SGLD that uses gradient clipping. Experimental validation on Bayesian regression and classification demonstrates that SGLRW remains stable in regimes where SGLD fails, including in the presence of heavy-tailed gradient noise, while matching or improving upon SGLD's predictive performance.
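To make the contrast concrete, the sketch below compares a standard SGLD step with a coordinate-wise lattice-random-walk step of the kind SGLRW builds on. This is an illustrative construction under common assumptions (lattice spacing h = sqrt(eps), gradient-tilted step probabilities that match the Langevin drift and diffusion to first order); it is not the paper's SGLRW update, whose noise enters through the off-diagonal covariance, and the function names are hypothetical.

```python
import numpy as np

def sgld_step(theta, grad, eps, rng):
    # Standard SGLD: Euler-Maruyama step with injected Gaussian noise.
    # Stochastic-gradient noise enters the update unboundedly.
    return theta - 0.5 * eps * grad + np.sqrt(eps) * rng.standard_normal(theta.shape)

def lattice_rw_step(theta, grad, eps, rng):
    # Illustrative coordinate-wise lattice random walk step (not the paper's
    # SGLRW construction). Each coordinate moves +/- h with a gradient-tilted
    # probability so that E[dtheta] = -0.5 * eps * grad and Var[dtheta] ~ eps,
    # matching the SGLD drift and diffusion to first order in eps.
    h = np.sqrt(eps)                      # lattice spacing
    p_up = 0.5 * (1.0 - 0.5 * h * grad)   # P(move +h) per coordinate
    p_up = np.clip(p_up, 0.0, 1.0)        # bounded moves => implicit gradient clipping
    steps = np.where(rng.random(theta.shape) < p_up, h, -h)
    return theta + steps

# Usage on a standard Gaussian target, U(theta) = ||theta||^2 / 2:
rng = np.random.default_rng(0)
theta = np.zeros(2)
for _ in range(1000):
    theta = lattice_rw_step(theta, grad=theta, eps=0.01, rng=rng)
```

Because the lattice step is bounded by the spacing h regardless of the gradient's magnitude, a heavy-tailed stochastic gradient can tilt the step probabilities but never produce an unbounded move, which is one intuition for the robustness claim above.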