Bayesian inference typically relies on specifying a parametric model that approximates the data-generating process. However, a misspecified model can yield poor posterior convergence rates and unreliable calibration. Bayesian empirical likelihood offers a semi-parametric alternative: it replaces the parametric likelihood with a profile empirical likelihood defined through moment constraints, thereby avoiding explicit distributional assumptions. Despite these advantages, Bayesian empirical likelihood poses substantial computational challenges, including the need to solve a constrained optimization problem at every likelihood evaluation and the non-convexity of the posterior support, which is particularly problematic in small-sample settings. This paper introduces a variational approach based on expectation propagation that approximates the Bayesian empirical-likelihood posterior, balancing computational cost and accuracy without altering the target posterior through adjustments such as pseudo-observations. Empirically, we show that our approach can achieve a superior cost-accuracy trade-off relative to existing methods, including Hamiltonian Monte Carlo and variational Bayes. Theoretically, we establish that the approximation and the Bayesian empirical-likelihood posterior are asymptotically equivalent.
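To make the computational bottleneck concrete, the following minimal sketch (not the authors' expectation-propagation method) illustrates the inner constrained optimization that Bayesian empirical likelihood must solve at every likelihood evaluation. It computes the profile empirical log-likelihood at a fixed parameter value via Owen's convex dual, using a pseudo-logarithm for numerical stability; the function names and the mean-constraint estimating function g(x, theta) = x - theta in the demo are hypothetical choices for illustration only.

```python
# Sketch: profile empirical log-likelihood at a fixed theta via Owen's dual.
# Assumptions: moment function g(x, theta) = x - theta (placeholder), helper
# names are ours, and this is not the paper's EP algorithm.

import numpy as np
from scipy.optimize import minimize


def log_star(z, eps):
    """Owen's pseudo-logarithm: log(z) for z >= eps, with a smooth quadratic
    extension below eps so the dual objective stays finite for any lambda."""
    z = np.asarray(z, dtype=float)
    out = np.empty_like(z)
    ok = z >= eps
    out[ok] = np.log(z[ok])
    out[~ok] = np.log(eps) - 1.5 + 2.0 * z[~ok] / eps - z[~ok] ** 2 / (2.0 * eps ** 2)
    return out


def empirical_log_likelihood(g_values):
    """Profile empirical log-likelihood given an n x d matrix whose rows are
    g(x_i, theta).  Solves the convex dual
        min_lambda  -sum_i log(1 + lambda' g_i)
    and returns  -n*log(n) - sum_i log(1 + lambda_hat' g_i)."""
    n, d = g_values.shape
    eps = 1.0 / n

    def dual(lam):
        return -np.sum(log_star(1.0 + g_values @ lam, eps))

    lam_hat = minimize(dual, np.zeros(d), method="BFGS").x
    return -n * np.log(n) - np.sum(log_star(1.0 + g_values @ lam_hat, eps))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(loc=1.0, scale=2.0, size=200)
    for theta in (0.5, 1.0, 1.5):
        g = (x - theta).reshape(-1, 1)  # mean constraint E[X - theta] = 0
        print(theta, empirical_log_likelihood(g))
```

Because this convex program must be re-solved for every proposed parameter value, any posterior sampler or variational scheme pays this cost repeatedly, which is the trade-off the abstract refers to.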