The information bottleneck (IB) method seeks a compressed representation of data that preserves the information relevant for predicting a target variable while discarding information irrelevant to it. In its classical formulation, the IB method employs mutual information both to quantify the compression between the original and compressed data and to measure the utility of the representation for the target variable. In this study, we investigate a generalized IB problem in which utility is evaluated via the $\mathcal{H}$-mutual information satisfying the concavity (\texttt{CV}) and averaging (\texttt{AVG}) conditions. This class of information measures admits a statistical decision-theoretic interpretation through its equivalence to the expected value of sample information. Building on this interpretation, we derive an alternating optimization algorithm that traces the tradeoff between compression and utility in the generalized IB problem.
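For context, the classical mutual-information IB tradeoff can be traced with Blahut-Arimoto-style alternating updates of the encoder $q(t|x)$, the marginal $q(t)$, and the decoder $q(y|t)$. The sketch below implements only these standard self-consistent equations for the classical case, not the paper's generalized $\mathcal{H}$-mutual-information algorithm; all function and variable names are illustrative.

```python
import numpy as np

def ib_alternating(p_xy, n_clusters, beta, n_iter=200, seed=0):
    """Classical IB via alternating (Blahut-Arimoto-style) updates.

    p_xy: joint distribution over (x, y), shape (|X|, |Y|).
    beta: tradeoff parameter weighting utility against compression.
    Returns the soft encoder q(t|x), shape (|X|, n_clusters).
    """
    rng = np.random.default_rng(seed)
    p_x = p_xy.sum(axis=1)                    # marginal p(x)
    p_y_given_x = p_xy / p_x[:, None]         # conditional p(y|x)

    # random initialization of the soft encoder q(t|x)
    q_t_given_x = rng.random((len(p_x), n_clusters))
    q_t_given_x /= q_t_given_x.sum(axis=1, keepdims=True)

    eps = 1e-12
    for _ in range(n_iter):
        q_t = p_x @ q_t_given_x               # q(t) = sum_x p(x) q(t|x)
        # decoder: q(y|t) = sum_x p(x) q(t|x) p(y|x) / q(t)
        q_y_given_t = (q_t_given_x * p_x[:, None]).T @ p_y_given_x
        q_y_given_t /= q_t[:, None] + eps
        # encoder update: q(t|x) ∝ q(t) exp(-beta * KL(p(y|x) || q(y|t)))
        kl = (p_y_given_x[:, None, :]
              * (np.log(p_y_given_x[:, None, :] + eps)
                 - np.log(q_y_given_t[None, :, :] + eps))).sum(axis=2)
        log_q = np.log(q_t + eps)[None, :] - beta * kl
        log_q -= log_q.max(axis=1, keepdims=True)   # numerical stability
        q_t_given_x = np.exp(log_q)
        q_t_given_x /= q_t_given_x.sum(axis=1, keepdims=True)

    return q_t_given_x
```

Sweeping `beta` from small to large values traces the compression-utility curve: small `beta` yields near-constant encoders (maximal compression), while large `beta` yields nearly deterministic encoders that preserve predictive information about $Y$.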