Updating diffusion models in an incremental setting is desirable in real-world applications yet computationally challenging. We present Concept Neuron Selection (CNS), a simple yet effective learning strategy for performing personalization in a continual learning scheme. CNS identifies the neurons in a diffusion model that are most closely related to the target concepts. To mitigate catastrophic forgetting while preserving zero-shot text-to-image generation ability, CNS fine-tunes these concept neurons incrementally and jointly preserves the knowledge learned from previous concepts. Evaluations on real-world datasets demonstrate that CNS achieves state-of-the-art performance with minimal parameter adjustments, outperforming previous methods on both single- and multi-concept personalization tasks. CNS also operates fusion-free, reducing the memory storage and processing time required for continual personalization.
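To make the two ideas in the abstract concrete, below is a minimal sketch, not the authors' implementation: it assumes concept neurons are selected by gradient magnitude on the new concept's data, and that "jointly preserving previous concepts" is approximated by an L2 pull toward the parameters obtained after earlier personalization steps. The toy linear model, `select_concept_neurons`, `continual_step`, and the `top_ratio` / `reg` parameters are all hypothetical stand-ins for illustration.

```python
# Hedged sketch of concept-neuron selection + incremental fine-tuning (assumptions noted above).
import torch
import torch.nn as nn

def select_concept_neurons(model, loss_fn, batch, top_ratio=0.01):
    """Rank parameters by gradient magnitude on the new concept's data and
    return boolean masks marking the top fraction as 'concept neurons'."""
    model.zero_grad()
    loss_fn(model, batch).backward()
    masks = {}
    for name, p in model.named_parameters():
        if p.grad is None:
            masks[name] = torch.zeros_like(p, dtype=torch.bool)
            continue
        scores = p.grad.abs().flatten()
        k = max(1, int(top_ratio * scores.numel()))
        thresh = torch.topk(scores, k).values.min()
        masks[name] = p.grad.abs() >= thresh
    model.zero_grad()
    return masks

def continual_step(model, loss_fn, batch, masks, prev_params, optimizer, reg=1e-2):
    """One update that (a) touches only masked entries and (b) pulls the masked
    weights toward their values after previous concepts were learned
    (an assumed stand-in for the paper's preservation objective)."""
    optimizer.zero_grad()
    loss = loss_fn(model, batch)
    for name, p in model.named_parameters():
        if name in prev_params:
            loss = loss + reg * ((p - prev_params[name]) * masks[name]).pow(2).sum()
    loss.backward()
    # Zero out gradients outside the selected concept neurons so only they are updated.
    for name, p in model.named_parameters():
        if p.grad is not None:
            p.grad.mul_(masks[name])
    optimizer.step()
    return loss.item()

# Toy usage: a tiny regression model stands in for the diffusion backbone.
model = nn.Linear(8, 8)
loss_fn = lambda m, b: nn.functional.mse_loss(m(b[0]), b[1])
batch = (torch.randn(4, 8), torch.randn(4, 8))
masks = select_concept_neurons(model, loss_fn, batch)
prev = {n: p.detach().clone() for n, p in model.named_parameters()}
opt = torch.optim.SGD(model.parameters(), lr=1e-2)
continual_step(model, loss_fn, batch, masks, prev, opt)
```

Because only the masked parameters move, each new concept touches a small slice of the network, which is consistent with the abstract's claims of minimal parameter adjustments and fusion-free operation; the actual selection criterion and preservation loss used by CNS may differ.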