We introduce BEDS (Bayesian Emergent Dissipative Structures), a formal framework for analyzing inference systems that must maintain beliefs continuously under energy constraints. Unlike classical computational models, which assume perfect memory and focus on one-shot computation, BEDS explicitly incorporates dissipation (information loss over time) as a fundamental constraint. We prove a central result linking energy, precision, and dissipation: maintaining a belief with precision $\tau$ against dissipation rate $\gamma$ requires power $P \geq \gamma k_{\rm B} T / 2$, with scaling $P \propto \gamma \tau$. This establishes a fundamental thermodynamic cost for continuous inference. We define three classes of problems -- BEDS-attainable, BEDS-maintainable, and BEDS-crystallizable -- and show that these classes are distinct from classical decidability. We propose the Gödel-Landauer-Prigogine conjecture, suggesting that closure pathologies across formal systems, computation, and thermodynamics share a common structure.
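The stated lower bound $P \geq \gamma k_{\rm B} T / 2$ can be evaluated numerically. The sketch below is illustrative only and is not from the paper; the function name and the example parameter values (a 1 kHz dissipation rate at room temperature) are assumptions chosen for concreteness.

```python
# Illustrative sketch (not from the paper): evaluating the BEDS lower
# bound P >= gamma * k_B * T / 2 on the power needed to maintain a
# belief against dissipation rate gamma at temperature T.

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact 2019 SI value)

def min_maintenance_power(gamma: float, temperature: float) -> float:
    """Lower bound (watts) on the power required to maintain a belief
    against dissipation rate gamma (1/s) at temperature T (kelvin)."""
    return gamma * K_B * temperature / 2

# Hypothetical example: a belief dissipating at 1 kHz at 300 K.
p_min = min_maintenance_power(gamma=1e3, temperature=300.0)
print(f"Minimum maintenance power: {p_min:.3e} W")  # ~2.07e-18 W
```

Even at kilohertz dissipation rates the bound is tiny in absolute terms, consistent with its role as a fundamental floor rather than a practical engineering limit.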