Few-shot class-incremental learning (FSCIL) aims to continually recognize novel classes from limited data, and it faces the key stability-plasticity dilemma: balancing the retention of old knowledge against the acquisition of new knowledge. To address this issue, we divide the task into two stages and propose a framework termed Static-Dynamic Collaboration (SDC) to achieve a better trade-off between stability and plasticity. Specifically, our method splits the standard FSCIL pipeline into a Static Retaining Stage (SRS) and a Dynamic Learning Stage (DLS), which harness static old-class information and incremental dynamic-class information, respectively. During SRS, we train an initial model on the abundant data of the base session and preserve its key part as static memory to retain fundamental old knowledge. During DLS, we introduce an extra dynamic projector that is jointly trained with the previously preserved static memory. By combining both stages, our method better retains old knowledge while continuously adapting to new classes. Extensive experiments on three public benchmarks and a real-world application dataset demonstrate that our method achieves state-of-the-art performance against competing methods.
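To make the two-stage idea concrete, the following is a minimal, hypothetical sketch of an SDC-style pipeline using nearest-prototype classification. It is not the authors' implementation: the encoder, the prototype-based classifier, and the projector `W` (shown here at its initialization rather than jointly trained) are all illustrative assumptions; the sketch only shows how a frozen static memory from the base session can be combined with a dynamic component for few-shot novel classes.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x):
    # Stand-in for the base-session feature extractor; in SDC this network is
    # trained on abundant base-session data and then kept as static memory.
    return x  # identity encoder, for illustration only

# --- Static Retaining Stage (SRS): abundant base-session data ---
base_data = {0: rng.normal([2.0, 0.0], 0.1, (20, 2)),
             1: rng.normal([0.0, 2.0], 0.1, (20, 2))}
# Static memory: frozen class prototypes (mean features) of the base classes.
static_protos = {c: encode(x).mean(axis=0) for c, x in base_data.items()}

# --- Dynamic Learning Stage (DLS): a few-shot novel class ---
novel_data = {2: rng.normal([2.0, 2.0], 0.1, (5, 2))}  # 5-shot novel class
# Dynamic projector; in SDC it would be jointly optimized with the static
# memory on incremental data. Here it stays at its (identity) initialization.
W = np.eye(2)
novel_protos = {c: W @ encode(x).mean(axis=0) for c, x in novel_data.items()}

# Unified classifier over old (static) and new (dynamic) prototypes.
protos = {**static_protos, **novel_protos}

def predict(x):
    feats = encode(np.asarray(x, dtype=float))
    classes = list(protos)
    dists = np.stack([np.linalg.norm(feats - protos[c], axis=1)
                      for c in classes])
    return [classes[i] for i in dists.argmin(axis=0)]
```

Because the static prototypes are frozen, adding the novel class cannot overwrite base-class knowledge (stability), while the dynamic side is free to adapt to incremental data (plasticity).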