We extend the Rate-Distortion-Perception (RDP) framework to the Rényi information-theoretic regime, using Sibson's $\alpha$-mutual information to characterize the fundamental limits under joint distortion and perception constraints. For scalar Gaussian sources, we derive closed-form expressions for the Rényi RDP function, showing that the perception constraint induces a feasible interval for the reproduction variance. Furthermore, we establish a Rényi generalization of the Strong Functional Representation Lemma. Our analysis reveals a phase transition in the complexity of optimal functional representations: for $0.5 < \alpha < 1$, the coding cost is bounded by the $\alpha$-divergence of order $\alpha + 1$, necessitating a codebook with heavy-tailed polynomial decay; conversely, for $\alpha > 1$, the representation collapses to one with finite support. These results offer new insights into the compression of shared randomness under generalized notions of mutual information.