$H$-mutual information ($H$-MI) is a broad class of information leakage measures parameterized by a pair $H=(\eta, F)$ consisting of a monotonically increasing function $\eta$ and a concave function $F$, which generalizes Shannon entropy. $H$-MI is defined as the difference between the generalized entropy $H$ and its conditional version, and includes Shannon mutual information (MI), Arimoto MI of order $\alpha$, $g$-leakage, and the expected value of sample information as special cases. This study presents a variational characterization of $H$-MI via statistical decision theory. Based on this characterization, we propose an alternating optimization algorithm for computing $H$-capacity.
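The abstract does not spell out the proposed algorithm, but for the special case where $H$-MI reduces to Shannon MI, the alternating-optimization idea is exactly the classical Blahut–Arimoto scheme: alternately optimize a reverse channel $q(x\mid y)$ and the input distribution $r(x)$. The following is a minimal sketch of that classical special case only (function name and numerical tolerances are my own), not the paper's $H$-capacity algorithm:

```python
import numpy as np

def blahut_arimoto(W, iters=200):
    """Alternating optimization for Shannon capacity C = max_r I(r, W).

    W: channel matrix with W[x, y] = P(Y = y | X = x).
    Returns the capacity in bits.
    """
    n, _ = W.shape
    r = np.full(n, 1.0 / n)               # input distribution, uniform start
    for _ in range(iters):
        # Step 1: for fixed r, the optimal reverse channel is the posterior
        q = r[:, None] * W                # joint p(x, y)
        q /= q.sum(axis=0, keepdims=True)  # q[x | y]
        # Step 2: for fixed q, the optimal input is r[x] ∝ exp(Σ_y W[x,y] log q[x|y])
        logq = np.where(W > 0, np.log(np.maximum(q, 1e-300)), 0.0)
        r = np.exp((W * logq).sum(axis=1))
        r /= r.sum()
    # Mutual information at the final r, in bits
    p_y = r @ W
    ratio = np.where(W > 0, W / p_y, 1.0)
    return float((r[:, None] * W * np.log2(np.maximum(ratio, 1e-300))).sum())

# Binary symmetric channel with crossover 0.1: C = 1 - h2(0.1) ≈ 0.531 bits
W = np.array([[0.9, 0.1], [0.1, 0.9]])
print(round(blahut_arimoto(W), 3))  # → 0.531
```

Each alternation is a coordinate-ascent step on a two-variable variational lower bound of MI, which is the same structural idea a variational characterization of $H$-MI enables for general $H$-capacity.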