This work explores properties of strong data-processing constants for Rényi divergences. Parallels are drawn with the well-studied $\varphi$-divergences, and it is shown that the order $\alpha$ of the Rényi divergence dictates whether certain contraction properties of $\varphi$-divergences are mirrored. In particular, we demonstrate that for $\alpha>1$ the contraction properties can deviate quite strikingly from those of $\varphi$-divergences. We also uncover specific characteristics of contraction for the $\infty$-Rényi divergence and relate them to $\varepsilon$-local differential privacy. The results are then applied to bound the speed of convergence of Markov chains, where we argue that the contraction of Rényi divergences offers a new perspective on the contraction of $L^\alpha$-norms commonly studied in the literature.
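The contraction behavior discussed above rests on the data-processing inequality: passing two distributions through a common channel can only shrink their Rényi divergence, and strong data-processing constants quantify by how much. The following minimal sketch checks this numerically for a few orders $\alpha$; the distributions `P`, `Q` and the binary symmetric channel `K` are illustrative assumptions, not taken from the paper.

```python
import math

def renyi_divergence(p, q, alpha):
    """D_alpha(P || Q) = (1/(alpha-1)) * log sum_i p_i^alpha q_i^(1-alpha), alpha != 1."""
    s = sum(pi ** alpha * qi ** (1 - alpha) for pi, qi in zip(p, q))
    return math.log(s) / (alpha - 1)

def push_through(p, K):
    """Output distribution PK of input p through a row-stochastic channel K."""
    n_out = len(K[0])
    return [sum(p[i] * K[i][j] for i in range(len(p))) for j in range(n_out)]

# Hypothetical inputs: two distributions on {0,1} and a binary
# symmetric channel with crossover probability 0.25.
P = [0.9, 0.1]
Q = [0.2, 0.8]
K = [[0.75, 0.25], [0.25, 0.75]]

for alpha in (0.5, 2.0, 10.0):
    d_in = renyi_divergence(P, Q, alpha)
    d_out = renyi_divergence(push_through(P, K), push_through(Q, K), alpha)
    # Data-processing inequality: the channel contracts the divergence.
    assert d_out <= d_in
    print(f"alpha={alpha}: D(P||Q)={d_in:.4f}, D(PK||QK)={d_out:.4f}")
```

The strong data-processing constant of $K$ at order $\alpha$ is the tightest factor $\eta$ with $D_\alpha(PK \| QK) \le \eta\, D_\alpha(P \| Q)$ over all input pairs; the ratio `d_out / d_in` printed here is one lower bound on it.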