In class-incremental learning, an agent with limited resources needs to learn a sequence of classification tasks, forming an ever-growing classification problem, under the constraint that data from previous tasks cannot be accessed. The main difference from task-incremental learning, where a task ID is available at inference time, is that the learner must also perform cross-task discrimination, i.e. distinguish between classes that have never been seen together. Approaches to this problem are numerous and mostly rely on an external memory (buffer) of non-negligible size. In this paper, we ablate the learning of cross-task features and study its influence on the performance of basic replay strategies used for class-IL. We also define a new forgetting measure for class-incremental learning and find that forgetting is not the principal cause of low performance. Our experimental results show that future algorithms for class-incremental learning should not only prevent forgetting, but also aim to improve the quality of the cross-task features and the knowledge transfer between tasks. This is especially important when tasks contain limited amounts of data.
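For illustration only (this is not the paper's method), the external memory mentioned above is commonly a fixed-size buffer of past examples maintained by reservoir sampling, from which old data is mixed into each new task's training batches. A minimal sketch, assuming a hypothetical `ReplayBuffer` class:

```python
import random

class ReplayBuffer:
    """Fixed-size memory of past (x, y) examples, filled by reservoir sampling."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = []   # stored (example, label) pairs
        self.seen = 0    # total number of examples offered so far

    def add(self, x, y):
        # Reservoir sampling: every example seen so far remains in the
        # buffer with equal probability capacity / seen.
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append((x, y))
        else:
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.data[j] = (x, y)

    def sample(self, k):
        # Draw stored past examples to replay alongside the current task's batch.
        return random.sample(self.data, min(k, len(self.data)))
```

Because the buffer's capacity does not grow with the number of tasks, each task's representation in memory shrinks as learning proceeds, which is one reason buffer size is a non-negligible factor in class-IL performance.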