Singular learning theory characterizes Bayesian models with non-identifiable parameterizations through two central quantities: the real log canonical threshold (RLCT), which governs marginal likelihood asymptotics, and the singular fluctuation, which determines second-order generalization behavior and the complexity term in WAIC. While the geometric meaning of the RLCT is well understood, the interpretation of singular fluctuation has remained comparatively opaque. We show that singular fluctuation admits a precise thermodynamic interpretation. Under a tempered (Gibbs) posterior, it is exactly the curvature of the Bayesian free energy with respect to inverse temperature; equivalently, the variance of the log-likelihood observable. In this sense, singular fluctuation is the statistical analogue of specific heat. This identity clarifies why singular fluctuation controls the equation of state relating training and generalization error and explains the success of WAIC in singular models: WAIC estimates a fluctuation coefficient rather than a parameter dimension. Across Gaussian mixture models and reduced-rank regression, we demonstrate that singular fluctuation behaves as a thermodynamic response coefficient. As temperature decreases, posterior reorganization suppresses fluctuation directions that affect predictive performance, and model-specific geometric observables track the decay of singular fluctuation. Rather than introducing new asymptotic expansions, this work unifies existing variance identities, equation-of-state results, and WAIC complexity corrections under a single free-energy curvature framework.
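The curvature identity claimed above can be sketched in standard tempered-posterior notation (the symbols \(Z_n\), \(F_n\), \(\varphi\), and \(\beta\) below are illustrative conventions, not necessarily the paper's own):

```latex
\[
Z_n(\beta) = \int \varphi(w)\,\prod_{i=1}^{n} p(X_i \mid w)^{\beta}\, dw,
\qquad
F_n(\beta) = -\log Z_n(\beta).
\]
% Writing the log-likelihood observable as
\[
\ell(w) = \sum_{i=1}^{n} \log p(X_i \mid w),
\]
% differentiation under the integral gives the cumulant identities
\[
F_n'(\beta) = -\,\mathbb{E}_{\beta}\!\left[\ell(w)\right],
\qquad
F_n''(\beta) = -\,\mathrm{Var}_{\beta}\!\left[\ell(w)\right],
\]
% where $\mathbb{E}_{\beta}$ denotes expectation under the tempered
% (Gibbs) posterior $\varphi(w)\,e^{\beta \ell(w)} / Z_n(\beta)$.
```

The curvature of the free energy in the inverse temperature is thus (minus) the variance of the log-likelihood under the tempered posterior, which is the sense in which singular fluctuation plays the role of a specific heat.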
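The identity \(F_n''(\beta) = -\mathrm{Var}_\beta[\ell(w)]\) can be verified numerically on a toy model. The sketch below uses a one-parameter Gaussian likelihood on a discrete parameter grid with a uniform prior; all names and the model choice are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

# Toy check of the free-energy curvature identity F''(beta) = -Var_beta[ell(w)]
# on a discrete parameter grid (illustrative setup, not the paper's models).

rng = np.random.default_rng(0)
x = rng.normal(0.5, 1.0, size=20)        # observed data
w = np.linspace(-3.0, 3.0, 2001)         # parameter grid, uniform prior

# Log-likelihood ell(w) = sum_i log N(x_i | w, 1), vectorized over the grid.
ell = -0.5 * np.sum((x[None, :] - w[:, None]) ** 2, axis=1) \
      - 0.5 * x.size * np.log(2.0 * np.pi)

def free_energy(beta):
    """F(beta) = -log sum_w exp(beta * ell(w)); the uniform prior weight
    is an additive constant in F and drops out of all derivatives."""
    m = beta * ell
    return -(m.max() + np.log(np.sum(np.exp(m - m.max()))))

beta, h = 1.0, 1e-3
# Second derivative of F in beta by central finite differences.
curvature = (free_energy(beta + h) - 2.0 * free_energy(beta)
             + free_energy(beta - h)) / h**2

# Exact variance of ell under the tempered (Gibbs) posterior on the grid.
p = np.exp(beta * ell - np.max(beta * ell))
p /= p.sum()
var_ell = np.sum(p * ell**2) - np.sum(p * ell) ** 2

# The two quantities should agree up to finite-difference error.
print(curvature, -var_ell)
```

Because the posterior here is tractable on a grid, both sides of the identity are computed independently: the left from three free-energy evaluations, the right from the exact tempered-posterior moments.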