Continual learning remains constrained by the need for repeated retraining, high computational costs, and the persistent challenge of forgetting. These factors severely limit the real-world applicability of continual learning, as iterative model updates demand substantial computational resources and inherently exacerbate forgetting. We present SAILS -- Segment Anything with Incrementally Learned Semantics, a training-free framework for Class-Incremental Semantic Segmentation (CISS) that sidesteps these challenges entirely. SAILS leverages foundation models to decouple CISS into two stages: zero-shot region extraction using the Segment Anything Model (SAM), followed by semantic association through prototypes in a fixed feature space. SAILS incorporates selective intra-class clustering, yielding multiple prototypes per class to better capture intra-class variability. Our results demonstrate that, despite requiring no incremental training, SAILS typically surpasses existing training-based approaches on standard CISS datasets, particularly on long and challenging task sequences where forgetting tends to be most severe. Because it performs no parameter updates, SAILS eliminates forgetting entirely and maintains consistent, task-invariant performance. Furthermore, SAILS exhibits positive backward transfer, where the introduction of new classes can enhance performance on previously learned classes.
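The semantic-association stage described above can be sketched as nearest-prototype matching: each SAM-extracted region is embedded in a fixed feature space and assigned the class of its most similar prototype, where a class may own several prototypes. This is a minimal illustration, not the paper's implementation; the feature extractor, the exact similarity measure, and the prototype-construction details (the "selective" clustering criterion) are assumptions here.

```python
import numpy as np

def l2norm(x):
    """L2-normalize rows so that dot products equal cosine similarities."""
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

def assign_labels(region_feats, prototypes):
    """Assign each region the class of its nearest prototype.

    region_feats: (R, D) array of L2-normalized region features
                  (e.g. pooled from a frozen backbone over SAM masks).
    prototypes:   dict mapping class_id -> (K_c, D) array of L2-normalized
                  prototypes; K_c > 1 when intra-class clustering split
                  the class into multiple modes.
    Returns a list of R class ids.
    """
    labels = []
    for f in region_feats:
        best_cls, best_sim = None, -np.inf
        for cls, protos in prototypes.items():
            # Max over this class's prototypes: a region only needs to
            # match one intra-class mode, not the class mean.
            sim = float((protos @ f).max())
            if sim > best_sim:
                best_sim, best_cls = sim, cls
        labels.append(best_cls)
    return labels
```

Because prototypes are stored per class and never modified when new classes arrive, adding a class is a dictionary insertion with no parameter updates, which is why forgetting cannot occur in this scheme.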