Generalized Bayesian Inference (GBI) provides a flexible framework for updating prior distributions using general loss functions in place of the traditional likelihood, thereby enhancing robustness to model misspecification. However, GBI is often hampered by intractable likelihoods. Kernelized Stein Discrepancy (KSD), as employed in a recent study, addresses this challenge by relying only on the gradient of the log-likelihood. Despite this innovation, KSD-Bayes suffers from critical pathologies, including insensitivity to well-separated modes in multimodal posteriors. To address this limitation, we propose a weighted KSD method that retains computational efficiency while effectively capturing multimodal structure. Our method extends the GBI framework to intractable multimodal posteriors while preserving key theoretical properties, such as posterior consistency and asymptotic normality. Experimental results demonstrate that our method substantially improves mode sensitivity compared to standard KSD-Bayes, while retaining robust performance in unimodal settings and in the presence of outliers.
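To illustrate the point that KSD depends on the model only through the score function ∇_x log p(x), not the (possibly intractable) normalizing constant, the following is a minimal sketch of a standard V-statistic KSD estimate with an RBF kernel. This is not the weighted KSD proposed above; the function name, bandwidth default, and Gaussian example are illustrative assumptions only.

```python
import numpy as np

def ksd_vstat(x, score, h=1.0):
    """V-statistic estimate of the squared KSD with an RBF kernel.

    x     : (n, d) array of samples
    score : function mapping (n, d) samples to (n, d) score values
            grad_x log p(x); the normalizer of p is never needed
    h     : RBF bandwidth (illustrative default, not tuned)
    """
    n, d = x.shape
    s = score(x)                              # (n, d) score evaluations
    diff = x[:, None, :] - x[None, :, :]      # (n, n, d) pairwise x_i - x_j
    sq = np.sum(diff ** 2, axis=-1)           # (n, n) squared distances
    k = np.exp(-sq / (2 * h ** 2))            # RBF kernel matrix
    # Stein kernel u_p(x_i, x_j), assembled term by term:
    t1 = (s @ s.T) * k                                       # s_i' s_j k
    t2 = np.einsum('id,ijd->ij', s, diff) * k / h ** 2       # s_i' grad_y k
    t3 = -np.einsum('jd,ijd->ij', s, diff) * k / h ** 2      # s_j' grad_x k
    t4 = (d / h ** 2 - sq / h ** 4) * k                      # trace term
    return (t1 + t2 + t3 + t4).mean()         # nonnegative V-statistic
```

For samples drawn from the model itself the estimate is near zero, while samples from a shifted distribution yield a clearly larger value, which is the discrepancy behavior GBI exploits when the likelihood is unnormalized.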