Recent studies have shown that deep neural networks are poorly calibrated and often produce over-confident predictions. This miscalibration stems primarily from the use of cross-entropy loss in classification, which aims to align predicted softmax probabilities with one-hot labels. In ordinal regression tasks, the problem is compounded by an additional challenge: cross-entropy does not encourage the softmax probabilities to form a unimodal distribution over the ordered classes. The ordinal regression literature has focused on learning class orders while largely overlooking calibration. To address both issues, we propose a novel loss function that introduces order-aware calibration, ensuring that prediction confidence respects the ordinal relationships between classes. It combines soft ordinal encoding with order-aware regularization to enforce both calibration and unimodality. Extensive experiments on three popular ordinal regression benchmarks demonstrate that our approach achieves state-of-the-art calibration without compromising accuracy.
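To make the two ingredients concrete, the sketch below illustrates one plausible instantiation of soft ordinal encoding and an order-aware unimodality regularizer. The exact form of the proposed loss is not specified in the abstract, so the distance-based softening (controlled by a temperature `tau`), the penalty on non-unimodal probability shapes, and the weighting coefficient `lam` are all illustrative assumptions, not the paper's actual formulation:

```python
import numpy as np

def soft_ordinal_targets(labels, num_classes, tau=1.0):
    # Hypothetical soft encoding: target mass decays with the ordinal
    # distance |k - y| to the true class y, instead of a one-hot label.
    ranks = np.arange(num_classes)
    dist = np.abs(ranks[None, :] - np.asarray(labels)[:, None])
    logits = -dist / tau
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def unimodality_penalty(probs, labels):
    # Penalize shape violations of a distribution peaked at the true class:
    # probabilities should be non-decreasing left of the peak and
    # non-increasing right of it.
    diffs = probs[:, 1:] - probs[:, :-1]                 # p_{k+1} - p_k
    ranks = np.arange(probs.shape[1] - 1)
    left = ranks[None, :] < np.asarray(labels)[:, None]  # before the peak
    viol = np.where(left, np.maximum(-diffs, 0.0), np.maximum(diffs, 0.0))
    return viol.sum(axis=1).mean()

def order_aware_loss(logits, labels, tau=1.0, lam=0.1):
    # Cross-entropy against soft ordinal targets plus the order-aware
    # regularizer; lam trades off calibration against unimodality.
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs = e / e.sum(axis=1, keepdims=True)
    targets = soft_ordinal_targets(labels, logits.shape[1], tau)
    ce = -(targets * np.log(probs + 1e-12)).sum(axis=1).mean()
    return ce + lam * unimodality_penalty(probs, labels)
```

With a one-hot limit (`tau → 0`) the soft targets recover standard cross-entropy, while larger `tau` spreads target mass onto neighboring ranks, which both tempers over-confidence and rewards unimodal predictions.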