Exemplar-free Class Incremental Learning (EFCIL) aims to learn from a sequence of tasks without having access to previous task data. In this paper, we consider the challenging Cold Start scenario in which insufficient data is available in the first task to learn a high-quality backbone. This is especially challenging for EFCIL since it requires high plasticity, which results in feature drift that is difficult to compensate for in the exemplar-free setting. To address this problem, we propose an effective approach to consolidate feature representations by regularizing drift in directions highly relevant to previous tasks while employing prototypes to reduce task-recency bias. Our approach, which we call Elastic Feature Consolidation++ (EFC++), exploits a tractable second-order approximation of feature drift based on a proposed Empirical Feature Matrix (EFM). The EFM induces a pseudo-metric in feature space which we use to regularize feature drift in important directions and to update Gaussian prototypes. In addition, we introduce a post-training prototype re-balancing phase that updates classifiers to compensate for feature drift. Experimental results on CIFAR-100, Tiny-ImageNet, ImageNet-Subset, ImageNet-1K and DomainNet demonstrate that EFC++ is better able to learn new tasks by maintaining model plasticity and significantly outperforms the state of the art.
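As a schematic illustration of the regularization described above (not the exact formulation given later in the paper), the EFM-induced pseudo-metric can be pictured as a Mahalanobis-style quadratic penalty on feature drift; here $f_t$ and $f_{t-1}$ denote the current and previous backbones, $E_{t-1}$ the Empirical Feature Matrix estimated after task $t-1$, and $\lambda$ an assumed trade-off weight, all of which are illustrative notation rather than definitions taken from this abstract:
$$
\mathcal{R}_{\mathrm{EFM}}(x) \;=\; \lambda \, \bigl(f_t(x) - f_{t-1}(x)\bigr)^{\top} E_{t-1} \, \bigl(f_t(x) - f_{t-1}(x)\bigr).
$$
Under this reading, drift along directions in which $E_{t-1}$ has large eigenvalues (directions important to previous tasks) is penalized strongly, while drift along unimportant directions remains cheap, which is how the method can preserve plasticity for new tasks.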