The Ingleton inequality is a classical linear information inequality that holds for representable matroids but fails to be universally valid for entropic vectors. Understanding the extent to which this inequality can be violated has been a longstanding problem in information theory. In this paper, we show that for a broad class of jointly distributed random variables $(X,Y)$, the Ingleton inequality holds up to a small additive error, even though the mutual information between $X$ and $Y$ is far from being extractable. Contrary to common intuition, strongly non-extractable mutual information does not lead to large violations of the Ingleton inequality in this setting. More precisely, we consider pairs $(X,Y)$ that are uniformly distributed on their joint support and whose associated biregular bipartite graph is an expander. For all auxiliary random variables $A$ and $B$ jointly distributed with $(X,Y)$, we establish a lower bound on the Ingleton quantity $I(X:Y | A) + I(X:Y | B) + I(A:B) - I(X:Y)$ in terms of the spectral parameters of the underlying graph. Our proof combines the expander mixing lemma with a partitioning technique for finite sets.
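The Ingleton quantity above is a combination of (conditional) mutual informations, each of which expands into joint Shannon entropies. As a minimal illustrative sketch (the helper names `marginal` and `ingleton_gap` are ours, not the paper's), it can be evaluated numerically for any finite joint distribution of $(X, Y, A, B)$:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a pmf given as {outcome: probability}."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def marginal(p, keep):
    """Marginalize a joint pmf over tuples (x, y, a, b), keeping positions in `keep`."""
    m = {}
    for outcome, q in p.items():
        key = tuple(outcome[i] for i in keep)
        m[key] = m.get(key, 0.0) + q
    return m

def ingleton_gap(p):
    """I(X:Y|A) + I(X:Y|B) + I(A:B) - I(X:Y) for a joint pmf over (x, y, a, b)."""
    H = lambda keep: entropy(marginal(p, keep))
    X, Y, A, B = 0, 1, 2, 3
    # I(X:Y|A) = H(X,A) + H(Y,A) - H(X,Y,A) - H(A), and symmetrically for B.
    i_xy_given_a = H([X, A]) + H([Y, A]) - H([X, Y, A]) - H([A])
    i_xy_given_b = H([X, B]) + H([Y, B]) - H([X, Y, B]) - H([B])
    i_ab = H([A]) + H([B]) - H([A, B])
    i_xy = H([X]) + H([Y]) - H([X, Y])
    return i_xy_given_a + i_xy_given_b + i_ab - i_xy

# Example: X, Y independent uniform bits, A = X XOR Y, B constant.
# Then I(X:Y|A) = 1 while the other three terms vanish.
p = {(x, y, x ^ y, 0): 0.25 for x in range(2) for y in range(2)}
print(ingleton_gap(p))  # 1.0
```

The Ingleton inequality asserts that this quantity is nonnegative whenever the entropy vector comes from a linear (representable) arrangement; the point of the paper is a lower bound on it for the expander-based distributions described above.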