While most continual learning methods focus on mitigating forgetting and improving accuracy, they often overlook network calibration, and few works aim to improve the calibration of continual models for more reliable predictions. Neural collapse, a phenomenon in which last-layer features collapse to their class means, has shown advantages in continual learning by reducing feature-classifier misalignment. Our work goes a step further by proposing a method that not only enhances calibration but also improves performance by reducing overconfidence, mitigating forgetting, and increasing accuracy. We introduce Sphere-Adaptive Mixup (SAMix), an adaptive mixup strategy tailored to neural collapse-based methods. SAMix adapts the mixing process to the geometric properties of feature spaces under neural collapse, yielding more robust regularization and better feature-classifier alignment. Experiments show that SAMix significantly boosts performance, surpassing state-of-the-art continual learning methods while also improving model calibration. By enhancing both across-task accuracy and the broader reliability of predictions, SAMix offers a promising step toward robust continual learning systems.
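The abstract does not spell out SAMix's mixing rule. As a rough intuition for what adapting mixup to the spherical feature geometry induced by neural collapse can look like, the sketch below interpolates two L2-normalized features along the geodesic of the unit sphere (slerp), so the mixed feature stays on the sphere rather than drifting toward its interior as with linear mixup. This is an illustrative assumption, not the paper's SAMix formulation; the function name, the Beta(α, α) coefficient, and the feature dimension are all placeholders.

```python
# Minimal sketch, assuming a generic spherical mixup (slerp) of normalized
# features with a Beta-sampled coefficient. NOT the paper's SAMix method.
import numpy as np

def slerp_mixup(f1, f2, alpha=0.2, rng=None):
    """Mix two feature vectors along the great circle of the unit sphere."""
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)                   # mixing coefficient, as in standard mixup
    u = f1 / np.linalg.norm(f1)                    # project both features onto the unit sphere
    v = f2 / np.linalg.norm(f2)
    omega = np.arccos(np.clip(u @ v, -1.0, 1.0))   # angle between the two directions
    if omega < 1e-7:                               # nearly identical directions: fall back to u
        return u, lam
    # Spherical linear interpolation: lam = 1 returns u, lam = 0 returns v.
    mixed = (np.sin(lam * omega) * u + np.sin((1.0 - lam) * omega) * v) / np.sin(omega)
    return mixed, lam                              # mixed feature has unit norm; lam weights the labels

# Usage: mix two random 512-dimensional features (dimension is arbitrary here).
f_a, f_b = np.random.randn(512), np.random.randn(512)
z, lam = slerp_mixup(f_a, f_b)
print(np.linalg.norm(z), lam)                      # norm is ~1.0 by construction
```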