Contrastive learning has been shown to be effective for learning representations from time series in a self-supervised manner. However, contrasting similar time series instances, or values from adjacent timestamps within a time series, neglects their inherent correlations, which degrades the quality of the learned representations. To address this issue, we propose SoftCLT, a simple yet effective soft contrastive learning strategy for time series. It introduces instance-wise and temporal contrastive losses with soft assignments ranging from zero to one. Specifically, we define soft assignments for 1) the instance-wise contrastive loss by the distance between time series in the data space, and 2) the temporal contrastive loss by the difference between timestamps. SoftCLT is a plug-and-play method for time series contrastive learning that improves the quality of learned representations without bells and whistles. In experiments, we demonstrate that SoftCLT consistently improves performance on various downstream tasks, including classification, semi-supervised learning, transfer learning, and anomaly detection, achieving state-of-the-art results. Code is available at this repository: https://github.com/seunghan96/softclt.
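The two soft assignments described above can be illustrated with a minimal sketch. This is not the paper's exact formulation; it assumes a sigmoid-based mapping from distance (or timestamp gap) to a weight in [0, 1], with a hypothetical sharpness parameter `tau` and mean absolute distance as the data-space metric.

```python
import numpy as np

def sigmoid(z):
    """Standard logistic function."""
    return 1.0 / (1.0 + np.exp(-z))

def instance_soft_assignment(x_i, x_j, tau=0.5):
    """Soft assignment between two time series instances.

    Decays from 1 toward 0 as the data-space distance grows.
    The distance metric and tau are illustrative assumptions.
    """
    d = np.abs(np.asarray(x_i) - np.asarray(x_j)).mean()
    # For d >= 0, 2 * sigmoid(-tau * d) lies in (0, 1], and equals 1 at d = 0.
    return 2.0 * sigmoid(-tau * d)

def temporal_soft_assignment(t, t_prime, tau=0.5):
    """Soft assignment between two timestamps within a series.

    Decays from 1 toward 0 as the timestamp difference grows.
    """
    return 2.0 * sigmoid(-tau * abs(t - t_prime))
```

In a contrastive loss, these weights would replace the hard 0/1 positive/negative labels, so that nearby instances and adjacent timestamps contribute in proportion to their similarity rather than being treated as strict negatives.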