We introduce a two-parameter family of discrepancy measures, termed \emph{$(G,f)$-divergences}, obtained by applying a non-decreasing function $G$ to an $f$-divergence $D_f$. Building on Csiszár's formulation of mutual $f$-information, we define a corresponding $(G,f)$-information measure $I_{G,f}(X;Y)$. A central theme of the paper is subadditivity over product distributions and product channels. We develop reduction principles showing that, for broad classes of $G$, it suffices to verify divergence subadditivity on binary alphabets. Specializing to the functions $G(x)\in\{x,\log(1+x),-\log(1-x)\}$, we derive tractable sufficient conditions on $f$ that guarantee subadditivity, covering many standard $f$-divergences. Finally, we present applications to finite-blocklength converses for channel coding, bounds in binary hypothesis testing, and an extension of the Shannon--Gallager--Berlekamp sphere-packing exponent framework to subadditive $(G,f)$-divergences.
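One natural reading of the construction, assuming Csiszár's standard definitions on finite alphabets (the precise normalizations below are illustrative, not taken from the paper):

```latex
\begin{align*}
D_f(P\|Q) &= \sum_{x} Q(x)\, f\!\left(\frac{P(x)}{Q(x)}\right),
  && f \text{ convex},\ f(1)=0,\\
D_{G,f}(P\|Q) &= G\bigl(D_f(P\|Q)\bigr),
  && G \text{ non-decreasing},\\
I_{G,f}(X;Y) &= G\bigl(D_f(P_{XY}\,\|\,P_X \otimes P_Y)\bigr).
\end{align*}
```

In this reading, subadditivity over product distributions means
$D_{G,f}(P_1\otimes P_2\,\|\,Q_1\otimes Q_2) \le D_{G,f}(P_1\|Q_1) + D_{G,f}(P_2\|Q_2)$, which for $G(x)=x$ reduces to the familiar additivity-type behavior of $f$-divergences over products.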