Contrastive learning has significantly improved representation quality, enhancing knowledge transfer across tasks in continual learning (CL). However, catastrophic forgetting remains a key challenge, as contrastive-based methods primarily focus on "soft relationships" or "softness" between samples, which shift with changing data distributions and lead to representation overlap across tasks. The recently identified Neural Collapse phenomenon has shown promise for CL by focusing on "hard relationships" or "hardness" between samples and fixed prototypes. However, this approach overlooks "softness", which is crucial for capturing intra-class variability, and its rigid focus can also pull old-class representations toward current ones, increasing forgetting. Building on these insights, we propose Focal Neural Collapse Contrastive (FNC2), a novel representation learning loss that effectively balances both soft and hard relationships. Additionally, we introduce the Hardness-Softness Distillation (HSD) loss to progressively preserve the knowledge gained from these relationships across tasks. Our method outperforms state-of-the-art approaches, particularly in minimizing memory reliance. Remarkably, even without the use of memory, our approach rivals rehearsal-based methods, offering a compelling solution for data privacy concerns.