In this paper we propose a new deterministic approximation method, called discretization approximation, for Bayesian computation. Discretization approximation is simple to understand and to implement: it only requires calculating posterior density values as probability masses at pre-specified support points. The resulting discrete distribution can be a good approximation to the target posterior distribution. All posterior quantities, including means, standard deviations, and quantiles, can be approximated by those of this completely known discrete distribution. We establish the convergence rate of discretization approximation as the number of support points goes to infinity. If the support points are generated from quasi-Monte Carlo sequences, then the rate is the same as that of integration approximation, which is generally faster than the optimal statistical rate. In this sense, discretization approximation is superior to the popular Markov chain Monte Carlo method. We also provide random sampling and representative-point construction methods based on discretization approximation. Numerical examples, including some benchmarks, demonstrate that the proposed method performs well in both low-dimensional and high-dimensional cases.
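To make the core idea concrete, the following is a minimal one-dimensional sketch, not the paper's algorithm: evaluate an (unnormalized) posterior density at pre-specified support points, normalize the values into probability masses, and read posterior quantities off the resulting discrete distribution. The toy target, the uniform grid of support points, and all names here are illustrative assumptions; the paper's rate results concern quasi-Monte Carlo support points.

```python
import math

# Toy target: unnormalized posterior density proportional to N(1, 0.5^2).
# (Illustrative assumption; any unnormalized posterior would do.)
def unnorm_post(theta):
    return math.exp(-0.5 * ((theta - 1.0) / 0.5) ** 2)

# Pre-specified support points. A uniform grid is used here for simplicity;
# the paper instead generates support points from quasi-Monte Carlo sequences.
n = 2001
lo, hi = -3.0, 5.0
points = [lo + (hi - lo) * i / (n - 1) for i in range(n)]

# Posterior density values at the support points, normalized into masses.
w = [unnorm_post(t) for t in points]
total = sum(w)
masses = [wi / total for wi in w]

# Posterior quantities of the completely known discrete distribution.
mean = sum(p * m for p, m in zip(points, masses))
sd = math.sqrt(sum((p - mean) ** 2 * m for p, m in zip(points, masses)))

def quantile(q):
    """Quantile of the discrete approximation via its cumulative masses."""
    acc = 0.0
    for p, m in zip(points, masses):
        acc += m
        if acc >= q:
            return p
    return points[-1]
```

Here the grid is symmetric about the true mean, so `mean` recovers 1.0 and `sd` is close to 0.5; in practice the normalizing constant of the posterior is unknown, which this normalization step sidesteps.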