A mathematical framework for information-theoretic analysis is established, offering a new viewpoint in which transmitted messages and communication channels are described by nonlinear expectation theory, beyond the framework of classical probability theory. The major motivation of this research is to address the probabilistic distribution uncertainty within increasingly complex communication networks, where random phenomena are often nonstationary and heterogeneous and cannot be characterized by a single probability distribution. Based on nonlinear expectation theory, this paper first explicitly defines several fundamental concepts, such as nonlinear information entropy, nonlinear joint entropy, nonlinear conditional entropy, and nonlinear mutual information, and establishes their basic properties. Second, using the strong law of large numbers under sublinear expectations, we propose a nonlinear source coding theorem, which shows that the nonlinear information entropy is the upper bound of the achievable coding rate of sources with uncertain distributions under the maximum error probability criterion, and determines a cluster point of the coding rate of such sources under the minimum error probability criterion. Third, we propose a nonlinear channel coding theorem, which gives an explicit expression for the upper bound of the achievable coding rate of communication channels with uncertain distributions under the maximum error probability criterion, and identifies a cluster point under the minimum error probability criterion. Finally, we propose a nonlinear rate-distortion source coding theorem, proving that the rate-distortion function based on the nonlinear mutual information is a cluster point of the lossy compression performance of uncertain-distribution sources under the minimum expected distortion criterion.
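To make the notion of distribution uncertainty concrete, the following toy sketch computes the worst-case Shannon entropy over a finite set of candidate distributions. This is only an illustrative stand-in: the paper's nonlinear information entropy is defined through sublinear expectations, not through this finite maximization, and the function names here are hypothetical.

```python
import math

def shannon_entropy(p):
    """Shannon entropy (in bits) of a discrete distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def upper_entropy(candidates):
    """Worst-case (sup) entropy over a finite set of candidate
    distributions -- a toy analogue of entropy for a source whose
    distribution is uncertain: a code must cover the worst case."""
    return max(shannon_entropy(p) for p in candidates)

# A binary source whose distribution is only known to lie in a set:
candidates = [
    [0.5, 0.5],   # fair coin: entropy 1.0 bit
    [0.9, 0.1],   # strongly biased coin
    [0.7, 0.3],   # mildly biased coin
]
print(upper_entropy(candidates))  # 1.0 bit, attained by the fair coin
```

The point of the example is the ordering: any achievable compression rate must be measured against the supremum over the uncertainty set, which is the role the nonlinear information entropy plays in the source coding theorem.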