We study the input-entropy-constrained Gaussian channel capacity problem in the asymptotic high signal-to-noise ratio (SNR) regime. We show that, as the SNR tends to infinity, the capacity-achieving input distribution is a discrete Gaussian distribution supported on a scaled integer lattice. Further, we show that the gap between the input entropy and the capacity vanishes exponentially in the SNR, and we characterize this exponent.