The exact common information of a set of random variables $X_1,\ldots,X_n$ is defined as the minimum entropy of a shared random variable that enables the exact distributed simulation of $X_1,\ldots,X_n$. It has been established that, in certain instances, infinite entropy is required to achieve distributed simulation, suggesting that continuous random variables may be needed in such scenarios. However, to date, there is no established metric to characterize such cases. In this paper, we propose the concept of Common Information Dimension (CID) with respect to a given class of functions $\mathcal{F}$, defined as the minimum dimension of a random variable $W$ required to distributedly simulate a set of random variables $X_1,\ldots,X_n$, such that $W$ can be expressed as a function of $X_1,\ldots,X_n$ using a member of $\mathcal{F}$. Our main contributions include the computation of the common information dimension of jointly Gaussian random vectors in closed form, with $\mathcal{F}$ being the class of linear functions.