Custom Diffusion Models (CDMs) have gained significant attention for their remarkable ability to personalize generative processes. However, existing CDMs suffer from catastrophic forgetting when continually learning new concepts. Most prior works attempt to mitigate this issue under a sequential learning setting with a fixed order of concept inflow, neglecting inter-concept interactions. In this paper, we propose a novel framework, Forget Less by Learning Together (FL2T), that enables concurrent, order-agnostic concept learning while addressing catastrophic forgetting. Specifically, we introduce a set-invariant inter-concept learning module in which proxies guide feature selection across concepts, facilitating improved knowledge retention and transfer. By leveraging inter-concept guidance, our approach preserves old concepts while efficiently incorporating new ones. Extensive experiments across three datasets demonstrate that our method significantly improves concept retention and mitigates catastrophic forgetting, highlighting the effectiveness of inter-concept catalytic behavior: on incremental concept learning over ten tasks, it achieves at least a 2% gain in average CLIP Image Alignment score.