This paper introduces a novel information-theoretic perspective on the relationship between three prominent group fairness notions in machine learning: statistical parity, equalized odds, and predictive parity. It is well known that these three notions cannot, in general, be satisfied simultaneously, motivating practitioners to seek approximately fair solutions rather than exact satisfaction of the definitions. However, a comprehensive analysis of their interrelations, particularly when they are not exactly satisfied, has remained largely unexplored. Our main contribution is an exact relationship between these three measures of (un)fairness, derived by leveraging a body of work in information theory called partial information decomposition (PID). Using PID, we identify the granular regions where the three measures of (un)fairness overlap and where they disagree with each other, leading to potential tradeoffs. We complement our results with numerical simulations.
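As background for the three fairness notions named in the abstract, the following is a minimal sketch of how their empirical gaps are typically measured on a dataset with a binary protected attribute. This is not the paper's code: the synthetic data, the deliberately biased predictor, and the `rate` helper are assumptions made purely for illustration.

```python
# Hedged illustration: empirical gaps for statistical parity, equalized
# odds, and predictive parity on synthetic data (not the paper's method).
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
A = rng.integers(0, 2, n)      # protected attribute (two groups)
Y = rng.integers(0, 2, n)      # true binary label
# Synthetic predictor that is biased toward group A=1 by construction.
Yhat = (Y + A + rng.random(n) > 1.2).astype(int)

def rate(num_mask, den_mask):
    """Conditional frequency P(num | den), guarding against empty groups."""
    return num_mask.sum() / max(den_mask.sum(), 1)

# Statistical parity gap: |P(Yhat=1 | A=0) - P(Yhat=1 | A=1)|
sp_gap = abs(rate((Yhat == 1) & (A == 0), A == 0)
             - rate((Yhat == 1) & (A == 1), A == 1))

# Equalized odds gap: worst-case TPR/FPR difference across groups.
tpr = [rate((Yhat == 1) & (Y == 1) & (A == a), (Y == 1) & (A == a)) for a in (0, 1)]
fpr = [rate((Yhat == 1) & (Y == 0) & (A == a), (Y == 0) & (A == a)) for a in (0, 1)]
eo_gap = max(abs(tpr[0] - tpr[1]), abs(fpr[0] - fpr[1]))

# Predictive parity gap: |P(Y=1 | Yhat=1, A=0) - P(Y=1 | Yhat=1, A=1)|
ppv = [rate((Y == 1) & (Yhat == 1) & (A == a), (Yhat == 1) & (A == a)) for a in (0, 1)]
pp_gap = abs(ppv[0] - ppv[1])

print(f"statistical parity gap: {sp_gap:.3f}")
print(f"equalized odds gap:     {eo_gap:.3f}")
print(f"predictive parity gap:  {pp_gap:.3f}")
```

Because the synthetic predictor is biased by construction, all three gaps come out strictly positive here, which is the approximate-unfairness regime the abstract refers to when exact satisfaction of the definitions fails.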