We study the problem of exact sampling under an exponential communication cost, specifically Campbell's average codeword length $L(t)$ of order $t$, and the associated Rényi entropy. We provide a lower bound on the Campbell cost of exact sampling that grows approximately as $D_{1/\alpha}(P\|Q)$, the Rényi divergence of order $1/\alpha$, where $\alpha = \frac{1}{1+t}$. Using the Poisson functional representation of Li and El Gamal, we prove an upper bound on $L(t)$ whose leading Rényi divergence term has order within $\epsilon$ of that of the lower bound. Our results reduce to the bounds of Harsha et al. as $\alpha \to 1$. We also provide numerical examples comparing the bounds for normal and Laplacian distributions, which show that the upper and lower bounds are typically within 5-10 bits of each other. Furthermore, we characterize exactly the optimal asymptotic Campbell cost $L(t)$ per sample as the number of independent and identically distributed (i.i.d.) samples grows to infinity. Finally, we show that under the exponential cost, any causal sampler performs strictly worse asymptotically than noncausal samplers; this contrasts with the case of expected message length, where causal and noncausal samplers achieve the same optimal asymptotic cost.
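For reference, the standard definitions behind these quantities, stated here with base-2 logarithms (the paper's normalization may differ): Campbell's average codeword length of order $t > 0$ for a prefix code with lengths $\ell(x)$ under source $P$, and the Rényi divergence of order $\alpha$, are

\[
L(t) = \frac{1}{t}\log_2 \sum_x P(x)\, 2^{t\,\ell(x)}, \qquad D_\alpha(P\|Q) = \frac{1}{\alpha-1}\log_2 \sum_x P(x)^{\alpha} Q(x)^{1-\alpha}.
\]

Campbell's source coding theorem gives $H_\alpha(P) \le \min_\ell L(t) < H_\alpha(P) + 1$ with $\alpha = \frac{1}{1+t}$, which is the same relation between $t$ and the order $\alpha$ appearing in the bounds above.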
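The Poisson functional representation underlying the upper bound also admits a short simulation. The following is a minimal sketch, not the paper's construction verbatim: it assumes densities $p$ and $q$ are available and that a finite bound $M \ge \sup_x p(x)/q(x)$ is known so the search can terminate (the function names and the truncation rule are illustrative).

import numpy as np

def pfr_index(p_pdf, q_pdf, q_sampler, M, rng=None):
    # Sketch of the Poisson functional representation (Li-El Gamal):
    # select K = argmin_i T_i / (dP/dQ)(Z_i), where T_1 < T_2 < ... are
    # arrival times of a rate-1 Poisson process and Z_i ~ Q i.i.d.
    # Then Z_K is an exact sample from P; the index K is what the encoder
    # describes, and its description length drives the Campbell cost L(t).
    # M must satisfy M >= sup_x p(x)/q(x) for the early-stopping test.
    rng = np.random.default_rng() if rng is None else rng
    t, best, best_z, best_k, i = 0.0, np.inf, None, None, 0
    while t <= best * M:             # once t > best*M, every future T_j/r_j > best
        i += 1
        t += rng.exponential(1.0)    # T_i: next arrival of a rate-1 Poisson process
        z = q_sampler(rng)           # candidate Z_i drawn from Q
        r = p_pdf(z) / q_pdf(z)      # likelihood ratio dP/dQ at Z_i
        if r > 0 and t / r < best:
            best, best_z, best_k = t / r, z, i
    return best_z, best_k

# Illustrative use with P = N(0, 0.5^2) and Q = N(0, 1), where sup p/q = 2:
p = lambda x: np.exp(-2.0 * x**2) / np.sqrt(0.5 * np.pi)
q = lambda x: np.exp(-0.5 * x**2) / np.sqrt(2.0 * np.pi)
z, k = pfr_index(p, q, lambda g: g.standard_normal(), M=2.0)

The returned index k is the quantity a sampler would encode; a causal sampler must commit to its description before seeing future samples, which is the distinction driving the asymptotic gap stated above.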