Self-supervised models have been shown to produce comparable or better visual representations than their supervised counterparts when trained offline on unlabeled data at scale. However, their efficacy is catastrophically reduced in a Continual Learning (CL) scenario where data is presented to the model sequentially. In this paper, we show that self-supervised loss functions can be seamlessly converted into distillation mechanisms for CL by adding a predictor network that maps the current state of the representations to their past state. This enables us to devise a framework for Continual self-supervised visual representation Learning that (i) significantly improves the quality of the learned representations, (ii) is compatible with several state-of-the-art self-supervised objectives, and (iii) needs little to no hyperparameter tuning. We demonstrate the effectiveness of our approach empirically by training six popular self-supervised models in various CL settings.
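To make the core idea concrete, below is a minimal sketch of how a self-supervised objective can be reused as a distillation mechanism. A small predictor MLP maps the current encoder's representations into the space of a frozen copy of the past encoder, and an SSL loss (here a SimSiam/BYOL-style negative cosine similarity, chosen purely for illustration) is applied between the prediction and the past representation. The names `Predictor` and `distill_loss` are illustrative, not the paper's reference implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Predictor(nn.Module):
    """Hypothetical predictor: maps current representations to the
    space of the frozen past encoder."""
    def __init__(self, dim: int = 2048, hidden_dim: int = 512):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden_dim),
            nn.BatchNorm1d(hidden_dim),
            nn.ReLU(inplace=True),
            nn.Linear(hidden_dim, dim),
        )

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        return self.net(z)

def distill_loss(z_current: torch.Tensor,
                 z_past: torch.Tensor,
                 predictor: nn.Module) -> torch.Tensor:
    """Reuse an SSL objective as distillation: predict the frozen
    past representation from the current one. z_past is detached,
    so gradients flow only into the current encoder and predictor."""
    p = predictor(z_current)
    return -F.cosine_similarity(p, z_past.detach(), dim=-1).mean()
```

In training on task t, this term would simply be added to the regular self-supervised loss on the current data, with `z_past` produced by a frozen snapshot of the encoder from the previous task; because the past branch carries no gradient, no extra hyperparameters beyond the predictor's architecture are strictly required, consistent with the little-to-no-tuning claim above.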


