This book aims to provide a graduate-level introduction to advanced topics in Markov chain Monte Carlo (MCMC) algorithms, as applied broadly in the Bayesian computational context. Most, if not all, of these topics (stochastic gradient MCMC, non-reversible MCMC, continuous-time MCMC, and new techniques for convergence assessment) have emerged within the last decade, and they have driven substantial recent practical and theoretical advances in the field. A particular focus is on methods that are scalable with respect to either the amount of data or the data dimension, motivated by emerging high-priority application areas in machine learning and AI.