Custom Diffusion Models (CDMs) offer impressive capabilities for personalization in generative modeling, yet they remain vulnerable to catastrophic forgetting when learning new concepts sequentially. Existing approaches focus primarily on minimizing interference between concepts, often overlooking the potential for positive inter-concept interactions. In this work, we present Forget Less by Learning from Parents (FLLP), a novel framework that mitigates forgetting through a parent-child inter-concept learning mechanism in hyperbolic space. By embedding concept representations in a Lorentzian manifold, whose geometry is naturally suited to tree-like hierarchies, we define parent-child relationships in which previously learned concepts guide the adaptation to new ones. Our method not only preserves prior knowledge but also supports the continual integration of new concepts. We validate FLLP on three public datasets and one synthetic benchmark, showing consistent improvements in both robustness and generalization.
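For reference, the Lorentz (hyperboloid) model mentioned above has a standard closed form; the abstract does not specify FLLP's exact parent-child objective, so the following is a sketch of the underlying geometry rather than the method itself. For points on the hyperboloid \(\mathcal{H}^n = \{\mathbf{x} \in \mathbb{R}^{n+1} : \langle \mathbf{x}, \mathbf{x} \rangle_{\mathcal{L}} = -1,\; x_0 > 0\}\), the Lorentzian inner product and the induced geodesic distance are
\[
\langle \mathbf{x}, \mathbf{y} \rangle_{\mathcal{L}} = -x_0 y_0 + \sum_{i=1}^{n} x_i y_i,
\qquad
d_{\mathcal{L}}(\mathbf{x}, \mathbf{y}) = \operatorname{arccosh}\!\bigl(-\langle \mathbf{x}, \mathbf{y} \rangle_{\mathcal{L}}\bigr).
\]
A parent-child guidance term could, for example, penalize \(d_{\mathcal{L}}\) between a new concept's embedding and that of its designated parent; this illustrative coupling is an assumption, not a statement of FLLP's actual loss.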