Few-shot learning has received considerable attention in recent years. While most previous methods are trained and tested on datasets from a single domain, cross-domain few-shot learning is a new branch of the problem in which the training and testing datasets come from different domains. In this paper, we address the setting where a model is pre-trained (meta-trained) on a single dataset and then fine-tuned on datasets from four different domains, including common objects, satellite images, and medical images. We propose a novel large-margin fine-tuning method (LMM-PQS), which generates pseudo query images from support images and fine-tunes the feature extraction modules with a large-margin mechanism inspired by face recognition methods. Experimental results show that LMM-PQS surpasses baseline models by a significant margin, demonstrating that our approach is robust and can easily adapt pre-trained models to new domains with little data.
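To make the large-margin mechanism concrete, the sketch below implements a CosFace-style additive-margin softmax loss, one common large-margin formulation from face recognition. This is an illustrative assumption, not the paper's exact loss: the function name, the scale `s`, and the margin `m` values are hypothetical choices for demonstration.

```python
import numpy as np

def large_margin_softmax_loss(features, weights, labels, s=30.0, m=0.35):
    """Additive-margin (CosFace-style) softmax loss; a sketch, not
    necessarily the exact formulation used by LMM-PQS.

    features: (N, D) embedding vectors
    weights:  (C, D) class-weight vectors
    labels:   (N,) integer class indices
    s: logit scale; m: additive cosine margin
    """
    n = len(labels)
    idx = np.arange(n)
    # L2-normalize embeddings and class weights so logits are cosines
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    w = weights / np.linalg.norm(weights, axis=1, keepdims=True)
    cos = f @ w.T                       # (N, C) cosine similarities
    logits = s * cos
    # subtract the margin from the target-class logit only,
    # forcing a larger angular gap between classes
    logits[idx, labels] = s * (cos[idx, labels] - m)
    # numerically stable log-softmax cross-entropy
    logits -= logits.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_probs[idx, labels].mean()
```

During fine-tuning, the pseudo query images generated from the support set would play the role of `features` here, so the margin penalty can be applied even though no labeled query data from the new domain is available. Setting `m > 0` strictly increases the loss for correctly assigned targets, which is what drives the tighter class clusters.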