In relative entropy coding, a sender aims to design a stochastic code such that, on input $X \sim P_X$, the receiver can generate a sample $Y \sim P_{Y \mid X}$. It is a standard result that (1) this requires at least $I(X; Y)$ bits, (2) the lower bound is achievable within a logarithmic gap, and (3) this gap cannot be reduced in general. The necessity of the gap suggests that the mutual information is not the correct information measure to quantify the rate of relative entropy coding. A potential alternative emerged in the work of Flamich et al. (2025), who proved a tighter lower bound of $I_F(X \to Y)$, a quantity we call the functional information. In this paper, we show that this lower bound is tight by constructing the ring toss code, an encoding method for rejection sampling that uses at most $I_F(X \to Y) + \log e$ bits. For the trivial channel $Y = X$, our result recovers the noiseless source coding theorem within a small constant. For a general channel, it implies that the classical mutual information lower bound is achievable within $\log(I(X; Y) + 1) + 2.45$ bits in general and within $1.45$ bits for singular channels, which are both the tightest bounds of their kind to date. Moreover, our one-shot result also recovers Sriramu and Wagner's asymptotic results on the second-order redundancy of relative entropy codes.
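Collecting the quantitative statements above into one place (our paraphrase; $L^\star$, our notation, denotes the minimum expected description length for simulating $P_{Y \mid X}$ given $X \sim P_X$):
\[
I(X; Y) \;\le\; I_F(X \to Y) \;\le\; L^\star \;\le\; I_F(X \to Y) + \log e,
\]
and, combining this with the achievability statements above,
\[
L^\star \;\le\; I(X; Y) + \log\bigl(I(X; Y) + 1\bigr) + 2.45 \;\text{ in general}, \qquad L^\star \;\le\; I(X; Y) + 1.45 \;\text{ for singular channels}.
\]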
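The abstract only names the ring toss code; its construction appears in the paper body. As background, here is a minimal sketch of the standard rejection-sampling approach to channel simulation that such codes encode (all function names and the toy distributions below are ours, for illustration): sender and receiver share the proposal stream through common randomness, so communicating the index of the first accepted proposal suffices to identify $Y \sim P_{Y \mid X}$.

```python
import numpy as np

def rejection_sample_index(x, sample_prior, density_ratio, ratio_bound, rng):
    """Standard rejection sampler for channel simulation (background sketch,
    not the paper's ring toss code). Both parties regenerate the same proposal
    stream from a shared seed, so the encoder only needs to transmit the index
    of the first accepted proposal.

    sample_prior(rng)    -- draws a proposal Y ~ P_Y (shared distribution)
    density_ratio(y, x)  -- dP_{Y|X=x}/dP_Y evaluated at y
    ratio_bound(x)       -- an upper bound M(x) >= sup_y density_ratio(y, x)

    Returns (index, y): the index to communicate and the accepted sample,
    which is distributed exactly according to P_{Y|X=x}.
    """
    m = ratio_bound(x)
    index = 0
    while True:
        y = sample_prior(rng)
        # Accept y with probability density_ratio(y, x) / M(x); thinning the
        # P_Y proposals this way yields a sample from P_{Y|X=x}.
        if rng.random() < density_ratio(y, x) / m:
            return index, y
        index += 1

# Toy example: a binary channel with a shared discrete proposal.
rng = np.random.default_rng(0)  # shared seed = common randomness
p_y_given_x = {0: np.array([0.9, 0.1]), 1: np.array([0.2, 0.8])}
prior = np.array([0.55, 0.45])  # assumed marginal P_Y

idx, y = rejection_sample_index(
    x=0,
    sample_prior=lambda r: r.choice(2, p=prior),
    density_ratio=lambda y, x: p_y_given_x[x][y] / prior[y],
    ratio_bound=lambda x: (p_y_given_x[x] / prior).max(),
    rng=rng,
)
```

The number of trials until acceptance is geometric with mean $M(x)$, so the transmitted index costs roughly $\log M(x)$ bits on average; per the abstract, the ring toss code is a method for encoding this rejection-sampling output whose expected length is at most $I_F(X \to Y) + \log e$ bits.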