Few-shot class-incremental learning (FSCIL) aims to continually fit new classes with limited training data, while maintaining the performance of previously learned classes. The main challenges are overfitting to the scarce new training samples and forgetting old classes. While catastrophic forgetting has been extensively studied, the overfitting problem has attracted less attention in FSCIL. To tackle the overfitting challenge, we design a new ensemble model framework combined with data augmentation to boost generalization. In this way, the enhanced model works as a library storing abundant features, guaranteeing fast adaptation to downstream tasks. Specifically, a multi-input multi-output ensemble structure is applied with a spatial-aware data augmentation strategy, aiming to diversify the feature extractor and alleviate overfitting in incremental sessions. Moreover, self-supervised learning is integrated to further improve model generalization. Comprehensive experimental results show that the proposed method can indeed mitigate the overfitting problem in FSCIL, and outperforms state-of-the-art methods.
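To make the ensemble structure concrete, below is a minimal NumPy sketch of the generic multi-input multi-output (MIMO) idea: a single shared backbone consumes several concatenated inputs and a lightweight head per ensemble member produces one prediction each; at inference, one sample is replicated and the heads are averaged. All sizes, weights, and function names here are illustrative placeholders, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: M ensemble members, D input dim, F feature dim, C classes.
M, D, F, C = 3, 8, 16, 5

# One shared backbone: the M inputs are concatenated and processed by a single
# network, which implicitly learns M subnetworks (the "implicit ensemble").
W_backbone = rng.standard_normal((M * D, F)) * 0.1
# One lightweight classification head per ensemble member.
W_heads = rng.standard_normal((M, F, C)) * 0.1

def mimo_forward(inputs):
    """inputs: (M, D) array -- one independent sample per ensemble member."""
    x = inputs.reshape(-1)            # concatenate the M inputs
    feat = np.tanh(x @ W_backbone)    # shared feature extractor
    # Each head is trained to predict the label of ITS OWN input sample.
    return np.stack([feat @ W_heads[m] for m in range(M)])

# Training-time forward: M different samples in, M sets of logits out.
logits = mimo_forward(rng.standard_normal((M, D)))
print(logits.shape)   # (3, 5)

# Inference: tile ONE sample M times and average the heads -> ensemble
# prediction at roughly the cost of a single forward pass.
x = rng.standard_normal(D)
ens = mimo_forward(np.tile(x, (M, 1))).mean(axis=0)
print(ens.shape)      # (5,)
```

The design point this illustrates is why MIMO suits FSCIL's overfitting problem: the ensemble diversity comes almost for free in parameters and compute, since the members share one backbone rather than each maintaining a full copy of the network.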