Prediction is a central task of statistics and machine learning, yet many inferential settings provide only partial information, typically in the form of moment constraints or estimating equations. We develop a finite-sample, fully Bayesian framework for propagating such partial information through predictive distributions. Building on de Finetti's representation theorem, we construct a curvature-adaptive version of exchangeable updating that operates directly under finite constraints, yielding an explicit discrete-Gaussian mixture that quantifies predictive uncertainty. The resulting finite-sample bounds depend on the smallest eigenvalue of the information-geometric Hessian, which measures the curvature and identification strength of the constraint manifold. This approach unifies empirical likelihood, Bayesian empirical likelihood, and generalized method-of-moments estimation within a common predictive geometry. On the operational side, it provides computable curvature-sensitive uncertainty bounds for constrained prediction; on the theoretical side, it recovers de Finetti's coherence, Doob's martingale convergence, and local asymptotic normality as limiting cases of the same finite mechanism. Our framework thus offers a constructive bridge between partial information and full Bayesian prediction.
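To make the curvature quantity concrete, the following is a minimal numerical sketch (not the paper's construction) of the GMM-style Hessian whose smallest eigenvalue measures identification strength. All choices here are illustrative assumptions: a single moment condition E[g(X, θ)] = 0 with g(x, θ) = x − θ, the efficient weight matrix, and a simple 1/√(n·λ_min) uncertainty scale standing in for the curvature-sensitive bound.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: the moment condition g(x, theta) = x - theta
# identifies theta as the population mean.
n = 500
x = rng.normal(loc=2.0, scale=1.5, size=(n, 1))

theta_hat = x.mean(axis=0)           # estimate solving the sample moment equation
g = x - theta_hat                    # moment residuals g(x_i, theta_hat)

G = -np.eye(1)                       # Jacobian E[dg/dtheta] (constant for this g)
S = (g.T @ g) / n                    # sample covariance of the moments
W = np.linalg.inv(S)                 # efficient GMM weight matrix

H = G.T @ W @ G                      # information-geometric (sandwich) Hessian
lam_min = np.linalg.eigvalsh(H).min()  # curvature / identification strength

# A curvature-sensitive uncertainty scale: weak curvature (small lam_min)
# inflates predictive uncertainty; strong curvature shrinks it.
scale = 1.0 / np.sqrt(n * lam_min)
print(f"lam_min = {lam_min:.4f}, uncertainty scale = {scale:.4f}")
```

In this scalar example the Hessian reduces to the inverse moment variance, so the scale coincides with the usual standard error of the mean; the eigenvalue formulation is what generalizes to vector-valued constraints, where a near-singular H signals a flat direction of the constraint manifold.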