We identify the critical deviation scale governing Bayesian evidence accumulation in regular parametric testing. Under integrated Bayes risk with zero-one loss, the risk-optimal rejection boundary lies in a moderate-deviation regime, inflated by a square-root-logarithmic factor relative to the usual local asymptotic normal scale. Under Cramér regularity, local prior smoothness at the null, and symmetric loss, we derive the sharp threshold and show that its leading logarithmic term is universal across regular priors, while lower-order constants depend on the local prior density, the Fisher information, and the prior model odds. The result extends to one-parameter exponential families through local asymptotic normality and places Jeffreys' testing threshold, the Bayesian information criterion penalty, and Chernoff-Stein-type error-exponent arguments within a common moderate-deviation framework.
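As a minimal numerical sketch of the claimed scale (assuming the simplest conjugate setting, a normal location model with a N(0, tau^2) prior under the alternative and equal prior model odds, rather than the paper's general regular family): with zero-one loss and equal odds, the risk-optimal rule rejects when the Bayes factor exceeds one, and solving BF10(xbar) = 1 in closed form shows the boundary on |xbar| shrinking like sqrt(log n / n), not the local-alternative rate 1/sqrt(n).

```python
import math

def bf10_threshold(n, tau2=1.0):
    """Risk-optimal rejection boundary on |xbar| in a toy normal model.

    Model (an illustrative assumption, not the paper's general setup):
    X_1..X_n iid N(theta, 1); H0: theta = 0; H1: theta ~ N(0, tau2);
    prior odds 1:1, zero-one loss, so reject iff BF10 > 1.

    With c = n * tau2 and z = sqrt(n) * xbar, the Bayes factor is
        BF10 = (1 + c)^(-1/2) * exp(z^2 / 2 * c / (1 + c)),
    and BF10 = 1 gives z^2 = (1 + 1/c) * log(1 + c) ~ log n:
    a moderate-deviation boundary |xbar| ~ sqrt(log n / n).
    """
    c = n * tau2
    z2 = (1.0 + 1.0 / c) * math.log1p(c)
    return math.sqrt(z2 / n)

if __name__ == "__main__":
    for n in (10**2, 10**4, 10**6):
        t = bf10_threshold(n)
        # Ratio to the candidate rate sqrt(log n / n) settles near 1,
        # while n**0.5 * t diverges: the boundary leaves the 1/sqrt(n) scale.
        print(n, round(t, 6), round(t / math.sqrt(math.log(n) / n), 4))
```

The logarithmic term in z^2 comes from the (1 + c)^(-1/2) Occam factor alone, which is why, as in the abstract, the leading log is universal while tau2 (the local prior density) only enters lower-order constants.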