The presence of symmetries imposes a stringent set of constraints on a system. This constrained structure allows intelligent agents interacting with such a system to drastically improve the efficiency of learning and generalisation, by internalising the system's symmetries into their information processing. In parallel, principled models of complexity-constrained learning and behaviour make increasing use of information-theoretic methods. Here, we marry these two perspectives and ask whether, and in what form, the information-theoretic lens can "see" the effects of a system's symmetries. For this purpose, we propose a novel variant of the Information Bottleneck principle, which has served as a productive basis for many principled studies of learning and information-constrained adaptive behaviour. We show (in the discrete case and under a specific technical assumption) that our approach formalises a certain duality between symmetry and information parsimony: namely, channel equivariances can be characterised by the optimal mutual-information-preserving joint compression of the channel's input and output. This information-theoretic treatment furthermore suggests a principled notion of "soft" equivariance, whose "coarseness" is measured by the amount of input-output mutual information preserved by the corresponding optimal compression. This new notion offers a bridge between the field of bounded rationality and the study of symmetries in neural representations. The framework may also enable the automatic discovery of exact and soft equivariances.
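To make the claimed duality concrete, the following is a minimal numerical sketch, not the paper's algorithm: for a toy discrete channel, merging inputs that the channel treats identically is a joint compression that preserves the input-output mutual information exactly, whereas merging non-equivalent inputs loses information, and the size of that loss quantifies how "soft" the corresponding equivariance is. The toy channel, the input distribution, and the helper functions `mutual_information` and `compress` are all illustrative assumptions.

```python
# A minimal sketch (illustrative, not the paper's method) of the idea that
# information-preserving joint compressions of a channel's input and output
# reveal its exact or "soft" equivariances.

import numpy as np

def mutual_information(p_xy):
    """I(X;Y) in bits for a joint distribution given as a 2-D array."""
    p_xy = p_xy / p_xy.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    mask = p_xy > 0
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / (p_x @ p_y)[mask])))

def compress(p_xy, x_partition, y_partition):
    """Coarse-grain the joint distribution by deterministic partitions
    of the input and output alphabets (given as lists of index lists)."""
    q = np.zeros((len(x_partition), len(y_partition)))
    for i, xs in enumerate(x_partition):
        for j, ys in enumerate(y_partition):
            q[i, j] = p_xy[np.ix_(xs, ys)].sum()
    return q

# Toy discrete channel p(y|x): inputs 0 and 1 induce the *same* conditional
# output distribution, so the channel is invariant under swapping them.
channel = np.array([
    [0.7, 0.2, 0.1],
    [0.7, 0.2, 0.1],   # identical to the row for input 0: an exact symmetry
    [0.1, 0.2, 0.7],
])
p_x = np.array([0.25, 0.25, 0.5])   # assumed input distribution
p_xy = p_x[:, None] * channel       # joint distribution p(x, y)

i_full = mutual_information(p_xy)

# Merging the symmetric inputs {0, 1} preserves I(X;Y) exactly:
i_sym = mutual_information(compress(p_xy, [[0, 1], [2]], [[0], [1], [2]]))

# Merging non-equivalent inputs {1, 2} is only a "soft" equivariance; the
# information it fails to preserve measures how badly the symmetry is broken.
i_soft = mutual_information(compress(p_xy, [[0], [1, 2]], [[0], [1], [2]]))

print(f"I(X;Y) = {i_full:.4f} bits")
print(f"merge symmetric inputs 0,1: {i_sym:.4f} bits (fully preserved)")
print(f"merge inputs 1,2:           {i_soft:.4f} bits (information lost)")
```

The exact-preservation case is the familiar sufficient-statistic fact that inputs with identical conditional output distributions can be merged without losing I(X;Y); the sketch simply reads that fact through the abstract's lens, with the preserved mutual information grading how close a candidate compression comes to an exact equivariance.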