There is currently renewed interest in the Bayesian predictive approach to statistics. This paper offers a review of foundational concepts and focuses on predictive modeling, which, by reasoning directly on prediction, either bypasses inferential models or characterizes them. We detail predictive characterizations in exchangeable and partially exchangeable settings, for a wide variety of data structures, and hint at new directions. The underlying concept is that Bayesian predictive rules are probabilistic learning rules, formalizing through conditional probability how we learn about future events given the available information. This concept has implications for any statistical problem and for inference, from classic contexts to less explored challenges, such as providing Bayesian uncertainty quantification for predictive algorithms in data science, as we show in the last part of the paper. The paper gives a historical overview, but also includes a few new results, presents some recent developments, and poses some open questions.