In the 21st century, many of the crucial scientific and technical issues facing humanity can be understood as problems of understanding, modelling, and ultimately controlling complex systems: systems composed of a large number of non-trivially interacting components whose collective behaviour can be difficult to predict. Information theory, a branch of mathematics historically associated with questions about encoding and decoding messages, has emerged as something of a lingua franca for those studying complex systems, far exceeding its original narrow domain of communication systems engineering. In the context of complexity science, information theory provides a set of tools that allow researchers to uncover the statistical and effective dependencies between interacting components, the relationships between systems and their environment, and mereological whole-part relationships, while remaining sensitive to non-linearities missed by common parametric statistical models. In this review, we aim to provide an accessible introduction to the core of modern information theory, written specifically for aspiring (and established) complex systems scientists. We begin with standard measures, such as Shannon entropy, relative entropy, and mutual information, before building to more advanced topics, including information dynamics, measures of statistical complexity, information decomposition, and effective network inference. In addition to detailing the formal definitions, we make an effort to discuss how information theory can be interpreted, and to develop the intuition behind abstract concepts like "entropy," in the hope that this will enable interested readers to understand what information is, and how it is used, at a more fundamental level.
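As a minimal illustration of the standard measures mentioned above, the sketch below computes Shannon entropy and mutual information for small discrete distributions directly from their definitions (entropy in bits, and I(X;Y) = H(X) + H(Y) - H(X,Y)). The function names and the example joint distribution are illustrative choices, not part of the review itself.

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution (list of probabilities)."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), with the joint given as a 2-D list p(x, y)."""
    px = [sum(row) for row in joint]          # marginal over y
    py = [sum(col) for col in zip(*joint)]    # marginal over x
    pxy = [p for row in joint for p in row]   # flattened joint
    return entropy(px) + entropy(py) - entropy(pxy)

# A fair coin carries exactly one bit of uncertainty.
print(entropy([0.5, 0.5]))                    # 1.0

# Two correlated binary variables: statistical dependence shows up
# as positive mutual information between them.
joint = [[0.4, 0.1],
         [0.1, 0.4]]
print(round(mutual_information(joint), 4))    # 0.2781
```

Because mutual information is defined through entropies of the full joint distribution rather than through any model of the relationship, it captures non-linear dependencies that a correlation coefficient would miss.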