Few-shot class-incremental learning (FSCIL) poses significant challenges for artificial neural networks, which must learn efficiently from limited data while retaining knowledge of previously learned tasks. Inspired by the brain's mechanisms for categorization and analogical learning, we propose a novel approach called Brain-inspired Analogical Mixture Prototypes (BAMP). BAMP has three components: mixed prototypical feature learning, statistical analogy, and soft voting. Starting from a pre-trained Vision Transformer (ViT), mixed prototypical feature learning represents each class with a mixture of prototypes and fine-tunes these representations during the base session. Statistical analogy calibrates the mean and covariance matrix of each new class's prototypes according to its similarity to the base classes, and computes classification scores with the Mahalanobis distance. Soft voting combines the merits of statistical analogy with those of an off-the-shelf FSCIL method. Experiments on benchmark datasets demonstrate that BAMP outperforms the state of the art in both the traditional big-start FSCIL setting and the challenging small-start FSCIL setting. The study suggests that brain-inspired analogical mixture prototypes can alleviate the catastrophic forgetting and overfitting problems in FSCIL.
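To make the statistical-analogy step concrete, below is a minimal sketch of one plausible realization: a new class's mean and covariance are calibrated by borrowing statistics from the most similar base classes, and the calibrated Gaussian is scored with the Mahalanobis distance. All names (calibrate_stats, mahalanobis_score), the similarity measure, and the weighting scheme are illustrative assumptions, not the authors' reference implementation.

```python
# Illustrative sketch only: assumes base_means has shape (C, D), base_covs
# has shape (C, D, D), and fewshot_feats has shape (K, D) for a K-shot class.
import numpy as np

def calibrate_stats(fewshot_feats, base_means, base_covs, top_k=3, alpha=0.5):
    """Estimate mean/covariance for a new class by analogy: blend the
    few-shot estimate with statistics of the most similar base classes
    (cosine similarity of class means; hypothetical weighting)."""
    mu_new = fewshot_feats.mean(axis=0)
    sims = base_means @ mu_new / (
        np.linalg.norm(base_means, axis=1) * np.linalg.norm(mu_new) + 1e-8)
    idx = np.argsort(sims)[-top_k:]                 # most similar base classes
    w = np.exp(sims[idx]); w /= w.sum()             # softmax-style weights
    mu_cal = alpha * mu_new + (1 - alpha) * (w[:, None] * base_means[idx]).sum(0)
    cov_cal = (w[:, None, None] * base_covs[idx]).sum(0)  # borrowed covariance
    return mu_cal, cov_cal

def mahalanobis_score(x, mu, cov, eps=1e-4):
    """Negative Mahalanobis distance of feature x to the calibrated
    class Gaussian, used as a classification score (higher is better)."""
    prec = np.linalg.inv(cov + eps * np.eye(cov.shape[0]))  # regularized inverse
    d = x - mu
    return -np.sqrt(d @ prec @ d)
```

Under this reading, soft voting would simply average such Mahalanobis-based scores with those of the off-the-shelf FSCIL method after normalization.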