Contrastive learning has been shown to be effective for learning representations from time series in a self-supervised way. However, contrasting similar time series instances, or values from adjacent timestamps within a time series, ignores their inherent correlations, which degrades the quality of the learned representations. To address this issue, we propose SoftCLT, a simple yet effective soft contrastive learning strategy for time series. This is achieved by introducing instance-wise and temporal contrastive losses with soft assignments ranging from zero to one. Specifically, we define soft assignments for 1) the instance-wise contrastive loss by the distance between time series in the data space, and 2) the temporal contrastive loss by the difference between timestamps. SoftCLT is a plug-and-play method for time series contrastive learning that improves the quality of learned representations without bells and whistles. In experiments, we demonstrate that SoftCLT consistently improves performance on various downstream tasks including classification, semi-supervised learning, transfer learning, and anomaly detection, achieving state-of-the-art performance. Code is available at this repository: https://github.com/seunghan96/softclt.
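The two kinds of soft assignments can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: it assumes a sigmoid-style mapping from Euclidean distance (for instance-wise assignments) and from timestamp difference (for temporal assignments) to weights in [0, 1]; the sharpness parameter `tau` and the choice of distance are illustrative assumptions.

```python
import numpy as np

def instance_soft_assignments(X, tau=1.0):
    """Soft assignments in [0, 1] between time series instances, based on
    pairwise Euclidean distance in the data space (illustrative sketch;
    the actual method may use a different distance, e.g. DTW)."""
    # X: (N, T) array of N univariate series of length T
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # (N, N) distances
    # 2 * sigmoid(-tau * d): equals 1 at d = 0, decays toward 0 as d grows
    return 2.0 / (1.0 + np.exp(tau * d))

def temporal_soft_assignments(T, tau=0.5):
    """Soft assignments in [0, 1] between timestamps, decaying with |t - t'|."""
    t = np.arange(T)
    diff = np.abs(t[:, None] - t[None, :])  # (T, T) timestamp differences
    return 2.0 / (1.0 + np.exp(tau * diff))
```

Under this sketch, identical instances (or identical timestamps) receive assignment 1, and the assignment decays smoothly toward 0 as the distance or timestamp gap grows, replacing the hard positive/negative split of standard contrastive learning.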