In Bayesian theory, information plays a central role. The influence of prior information on posterior outcomes can undermine the credibility of Bayesian studies, owing to the potentially subjective nature of the prior choice. When a priori knowledge is lacking, reference prior theory provides a powerful tool. Based on a mutual information criterion, it constructs a non-informative prior whose choice can be regarded as objective. In this paper, we contribute to the enrichment of reference prior theory. Specifically, we unveil an original analogy between reference prior theory and Global Sensitivity Analysis, from which we propose a natural generalization of the definition of mutual information. Leveraging dissimilarity measures between probability distributions, such as f-divergences, we provide a formalized framework for what we term generalized reference priors. Our main result establishes a limit of the mutual information, which simplifies the definition of reference priors to its maximizing arguments. This approach opens a new path that facilitates the theoretical derivation of reference priors under constraints or within specific classes of priors. In the absence of constraints, we further prove that the Jeffreys prior maximizes the generalized mutual information considered.
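The mutual-information criterion at the heart of the abstract can be illustrated numerically. The sketch below (not taken from the paper; the Bernoulli example, grid size, and sample size `n = 20` are illustrative assumptions) computes the mutual information I(θ; X) between a Bernoulli parameter and n observations under two priors, and checks that the Jeffreys prior Beta(1/2, 1/2) yields a higher value than the uniform prior, consistent with the asymptotic optimality of Jeffreys in the unconstrained case:

```python
import numpy as np
from scipy.special import comb

def mutual_info(prior_pdf, n, grid=2000):
    """Mutual information I(theta; X) for X ~ Binomial(n, theta)
    under the prior with (unnormalized) density prior_pdf, computed
    by quadrature on a midpoint grid (which avoids the endpoint
    singularities of the Jeffreys density)."""
    theta = (np.arange(grid) + 0.5) / grid
    w = prior_pdf(theta)
    w = w / w.sum()                         # normalized prior weights
    k = np.arange(n + 1)
    # Likelihood p(k | theta), shape (grid, n + 1).
    lik = comb(n, k) * theta[:, None] ** k * (1 - theta[:, None]) ** (n - k)
    m = w @ lik                             # marginal (prior predictive) p(k)
    # I = E_theta E_{k|theta} [ log p(k|theta) - log m(k) ]
    with np.errstate(divide="ignore", invalid="ignore"):
        integrand = np.where(lik > 0, lik * np.log(lik / m), 0.0)
    return float(w @ integrand.sum(axis=1))

n = 20
mi_jeffreys = mutual_info(lambda t: 1.0 / np.sqrt(t * (1 - t)), n)  # Beta(1/2, 1/2)
mi_uniform = mutual_info(lambda t: np.ones_like(t), n)              # Beta(1, 1)
```

For moderate n, the Jeffreys prior should already dominate the uniform prior in mutual information; the finite-n reference prior itself is generally discrete, so this comparison is only an asymptotic heuristic.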