Gács' coarse-grained algorithmic entropy leverages universal computation to quantify the information content of any given physical state. Unlike the Boltzmann and Gibbs-Shannon entropies, it requires no prior commitment to macrovariables or probabilistic ensembles, rendering it applicable to settings arbitrarily far from equilibrium. For measure-preserving dynamical systems equipped with a Markovian coarse-graining, we prove a number of fluctuation inequalities. These include algorithmic versions of Jarzynski's equality, Landauer's principle, and the second law of thermodynamics. In general, the algorithmic entropy determines a system's actual capacity to do work from an individual state, whereas the Gibbs-Shannon entropy only gives the mean capacity to do work from a state ensemble that is known a priori.