In statistical analysis, Monte Carlo (MC) is a classical numerical integration method. For challenging sampling problems, Markov chain Monte Carlo (MCMC) is commonly employed; however, the MCMC estimator is biased after any fixed number of iterations. Unbiased MCMC, an advancement built on coupling techniques, removes this bias and allows many short chains to be run in parallel. Quasi-Monte Carlo (QMC), known for its higher order of convergence, is an alternative to MC. By incorporating QMC ideas into MCMC, Markov chain quasi-Monte Carlo (MCQMC) effectively reduces the variance of MCMC, especially in Gibbs samplers. This work presents a novel approach that integrates unbiased MCMC with MCQMC, called unbiased MCQMC. The method yields unbiased estimators while significantly improving the rate of convergence. Numerical experiments demonstrate that for Gibbs sampling, unbiased MCQMC with a sample size of $N$ achieves a faster root mean square error (RMSE) rate than the \(O(N^{-1/2})\) rate of unbiased MCMC, approaching an RMSE rate of \(O(N^{-1})\) for low-dimensional problems. Remarkably, even in a challenging 1049-dimensional P\'olya-Gamma Gibbs sampling problem, the RMSE is still reduced severalfold for moderate sample sizes. In parallel settings, unbiased MCQMC also outperforms unbiased MCMC, even when running short chains.