In relative entropy coding, a sender aims to design a stochastic code such that, on input $X \sim P_X$, the receiver can generate a sample $Y \sim P_{Y \mid X}$. It is a standard result that (1) this requires at least $I(X; Y)$ bits, (2) the lower bound is achievable within a logarithmic gap, and (3) this gap cannot be reduced in general. The necessity of the gap suggests that mutual information is not the right information measure for quantifying the rate of relative entropy coding. A potential alternative emerged in the work of Flamich et al. (2025), who proved a tighter lower bound of $I_F(X \to Y)$, a quantity we call the functional information. In this paper, we show that this lower bound is tight by constructing the ring toss code, an encoding method for rejection sampling that uses at most $I_F(X \to Y) + \log e$ bits. This demonstrates that rejection sampling is optimal for relative entropy coding. Our result implies that the classical mutual information lower bound is achievable within $\log(I(X; Y) + 1) + 2.45$ bits in general, and within $1.45$ bits for singular channels; both are the tightest bounds of their kind to date. Moreover, our one-shot result recovers Sriramu and Wagner's asymptotic results on the second-order redundancy of relative entropy codes.
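To make the setup concrete, below is a minimal sketch of the classical rejection-sampling approach to relative entropy coding on a discrete alphabet, not the paper's ring toss code: both parties share a candidate stream $Y_1, Y_2, \ldots \sim P_Y$ (realized here as a common PRNG seed), the encoder transmits the index of the first candidate accepted by a rejection test against $P_{Y \mid X = x}$, and the decoder replays the stream to recover that sample. The function names, budget parameter, and demo distributions are illustrative assumptions.

```python
# Illustrative sketch of rejection-sampling-based relative entropy coding;
# NOT the ring toss code from the paper. Assumes discrete distributions and
# shared randomness via a common PRNG seed.
import numpy as np

def encode(p_y, p_y_given_x, seed, budget=100_000):
    """Rejection-sample against a shared stream Y_1, Y_2, ... ~ P_Y and
    return the index k of the first accepted candidate; only k is sent."""
    rng = np.random.default_rng(seed)            # shared randomness
    M = float(np.max(p_y_given_x / p_y))         # envelope: P_{Y|X} <= M * P_Y
    for k in range(budget):
        y = rng.choice(len(p_y), p=p_y)          # candidate Y_k ~ P_Y
        u = rng.random()                         # uniform for the accept test
        if u < p_y_given_x[y] / (M * p_y[y]):    # accept with prob. ratio / M
            return k
    raise RuntimeError("budget exhausted")

def decode(k, p_y, seed):
    """Replay the shared stream from the same seed and output Y_k;
    the decoder never sees X."""
    rng = np.random.default_rng(seed)
    for _ in range(k + 1):
        y = rng.choice(len(p_y), p=p_y)
        rng.random()                             # keep streams in lockstep
    return y

# Tiny demo over the alphabet {0, 1, 2}.
p_y = np.array([0.5, 0.3, 0.2])                  # marginal P_Y
p_y_given_x = np.array([0.2, 0.2, 0.6])          # target P_{Y|X=x} for this x
k = encode(p_y, p_y_given_x, seed=0)
y = decode(k, p_y, seed=0)
print(f"index sent: {k}, sample decoded: {y}")
```

In this sketch the transmitted index is geometric with mean $M$, so a universal integer code for it spends on the order of $\log M$ bits per sample; the ring toss code tightens this style of rate accounting to at most $I_F(X \to Y) + \log e$ bits, matching the functional information lower bound.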