Should prediction models always deliver a prediction? In the pursuit of maximum predictive performance, critical considerations of reliability and fairness are often overshadowed, particularly when it comes to the role of uncertainty. Selective regression, also known as the "reject option," allows models to abstain from predicting in cases of considerable uncertainty. First proposed seven decades ago, approaches to selective regression have mostly focused on distribution-based proxies for measuring uncertainty, particularly conditional variance. However, this focus neglects the significant influence of model-specific biases on a model's performance. In this paper, we propose a novel approach to selective regression that leverages conformal prediction, which provides grounded confidence measures for individual predictions based on model-specific biases. In addition, we propose a standardized evaluation framework to allow proper comparison of selective regression approaches. Through extensive experiments, we show that the proposed approach, conformalized selective regression, offers an advantage over multiple state-of-the-art baselines.
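To make the core idea concrete, the sketch below illustrates one standard way such a pipeline can be assembled: locally adaptive split conformal prediction for regression, followed by a reject option that abstains when the conformal interval for a point is too wide. This is a minimal illustration under our own assumptions, not the paper's exact method; the auxiliary residual model, the threshold `max_width`, and all other names are hypothetical.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Synthetic heteroscedastic data: noise is larger for x > 0.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(3000, 1))
noise = 0.1 + 0.4 * (X[:, 0] > 0)
y = np.sin(X[:, 0]) + rng.normal(0, noise)

X_tr, X_cal, y_tr, y_cal = train_test_split(X, y, test_size=0.4, random_state=0)

# Base regressor, plus an auxiliary model of residual magnitude used to
# normalize the nonconformity scores (locally adaptive conformal prediction).
model = RandomForestRegressor(random_state=0).fit(X_tr, y_tr)
resid = np.abs(y_tr - model.predict(X_tr))
sigma = RandomForestRegressor(random_state=0).fit(X_tr, resid)

# Normalized nonconformity scores on a held-out calibration set.
eps = 1e-8
scores = np.abs(y_cal - model.predict(X_cal)) / (sigma.predict(X_cal) + eps)

# Conformal quantile with the usual finite-sample correction.
alpha = 0.1  # target miscoverage
n = len(scores)
q = np.quantile(scores, min(1.0, np.ceil((n + 1) * (1 - alpha)) / n))

def predict_or_abstain(x_new, max_width=0.6):
    """Return (prediction, interval) per point, or None when the model abstains.

    `max_width` is an illustrative rejection threshold on the total
    conformal interval width, not a value from the paper.
    """
    x_new = np.atleast_2d(x_new)
    half_width = q * (sigma.predict(x_new) + eps)
    pred = model.predict(x_new)
    accept = 2 * half_width <= max_width
    return [(p, (p - w, p + w)) if a else None
            for p, w, a in zip(pred, half_width, accept)]

# Points in the noisy region (x > 0) get wider intervals and are rejected
# more often, which is the selective-regression behavior described above.
print(predict_or_abstain([[-2.0], [2.0]]))
```

By design, the abstention rule here is driven by the calibrated per-point interval width rather than by a distribution-based variance proxy alone, which mirrors the motivation stated in the abstract.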