Strong data processing inequalities (SDPIs) are an important object of study in information theory and have been well studied for $f$-divergences. Universal upper and lower bounds have been established, along with several applications connecting them to impossibility (converse) results, concentration of measure, hypercontractivity, and so on. In this paper, we study the R\'enyi divergence and its corresponding SDPI constant, whose behavior appears to deviate from that of ordinary $f$-divergences. In particular, one can construct examples showing that the universal upper bound relating its SDPI constant to that of total variation does not hold in general. We prove, however, that the universal lower bound involving the SDPI constant of the chi-square divergence does hold. Furthermore, we characterize the distribution that achieves the supremum when $\alpha$ is equal to $2$ and, as a consequence, compute the SDPI constant of the R\'enyi divergence for the general binary channel.
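As a minimal illustration of the quantity studied here, the sketch below numerically estimates the SDPI (contraction) constant of the R\'enyi divergence of order $\alpha = 2$ for a binary channel, using the standard definitions $D_\alpha(P\Vert Q) = \frac{1}{\alpha-1}\log\sum_x P(x)^\alpha Q(x)^{1-\alpha}$ and $\eta_\alpha(K) = \sup_{P \neq Q} D_\alpha(PK\Vert QK)/D_\alpha(P\Vert Q)$. The grid search and the specific crossover probabilities are illustrative assumptions, not the paper's closed-form result.

```python
import numpy as np

def renyi(p, q, alpha=2.0):
    # D_alpha(P||Q) = 1/(alpha-1) * log sum_x p(x)^alpha q(x)^(1-alpha)
    return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

def sdpi_constant(K, alpha=2.0, grid=150):
    # Grid-search estimate of eta_alpha(K) = sup_{P != Q} D(PK||QK) / D(P||Q)
    # over binary input distributions P = (a, 1-a), Q = (b, 1-b).
    best = 0.0
    ts = np.linspace(1e-3, 1 - 1e-3, grid)
    for a in ts:
        for b in ts:
            if abs(a - b) < 1e-9:
                continue  # divergence is zero when P = Q
            P = np.array([a, 1.0 - a])
            Q = np.array([b, 1.0 - b])
            num = renyi(P @ K, Q @ K, alpha)  # divergence after the channel
            den = renyi(P, Q, alpha)          # divergence before the channel
            best = max(best, num / den)
    return best

# Hypothetical general binary channel with crossover probabilities 0.1 and 0.2.
K = np.array([[0.9, 0.1],
              [0.2, 0.8]])
eta = sdpi_constant(K)
```

By the data processing inequality for R\'enyi divergence, the estimated constant must lie in $(0, 1]$; the grid search only lower-bounds the true supremum.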