Federated Class Incremental Learning (FCIL) is a new direction in continual learning (CL) that addresses catastrophic forgetting and non-IID data distributions simultaneously. Existing FCIL methods incur high communication costs and require exemplars from previous classes. We propose a novel rehearsal-free method for FCIL named prototype-injected prompt (PIP) that involves three main ideas: a) prototype injection on prompt learning, b) prototype augmentation, and c) weighted Gaussian aggregation on the server side. Our experimental results show that the proposed method outperforms the current state-of-the-art methods (SOTAs) with a significant improvement (up to 33%) on the CIFAR100, MiniImageNet, and TinyImageNet datasets. Our extensive analysis demonstrates the robustness of PIP across different task sizes, as well as its advantage of requiring fewer participating local clients and fewer global rounds. For further study, the source code of PIP, the baselines, and the experimental logs are shared publicly at https://github.com/anwarmaxsum/PIP.
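The abstract does not detail how prototype augmentation works; a common rehearsal-free approach is to synthesize features for old classes by perturbing each stored class prototype with Gaussian noise. The sketch below illustrates that general idea only; the function name, `sigma`, and `n_samples` are illustrative assumptions, not PIP's actual implementation.

```python
import numpy as np

def augment_prototypes(prototypes, sigma=0.1, n_samples=5, rng=None):
    """Hypothetical sketch of prototype augmentation: for each class,
    draw synthetic feature vectors around its prototype by adding
    zero-mean Gaussian noise with standard deviation `sigma`.

    prototypes: dict mapping class id -> 1-D feature vector (np.ndarray)
    returns: dict mapping class id -> (n_samples, feature_dim) array
    """
    rng = np.random.default_rng() if rng is None else rng
    augmented = {}
    for cls, proto in prototypes.items():
        noise = rng.normal(0.0, sigma, size=(n_samples, proto.shape[0]))
        augmented[cls] = proto + noise  # samples scattered around the prototype
    return augmented

# Toy usage with two 4-dimensional class prototypes
protos = {0: np.zeros(4), 1: np.ones(4)}
aug = augment_prototypes(protos, sigma=0.1, n_samples=5)
```

Such synthetic features can then be replayed during training on new tasks in place of stored exemplars, which is what makes the method rehearsal-free.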