Internet of Things (IoT) deployments operate in nonstationary, dynamic environments where factors such as sensor drift, evolving user behavior, and heterogeneous user privacy requirements can affect application utility. Continual learning (CL) addresses this by adapting models over time without catastrophic forgetting. Meanwhile, contrastive learning has emerged as a powerful representation-learning paradigm that improves robustness and sample efficiency in a self-supervised manner. This paper surveys \emph{contrastive continual learning} (CCL) for IoT, connecting algorithmic design (replay, regularization, distillation, prompts) with IoT system realities (TinyML constraints, intermittent connectivity, privacy). We present a unifying problem formulation, derive common objectives that blend contrastive and distillation losses, propose an IoT-oriented reference architecture for on-device, edge, and cloud-based CCL, and offer guidance on evaluation protocols and metrics. Finally, we highlight open challenges unique to the IoT domain, including handling tabular and streaming IoT data, concept drift, federated settings, and energy-aware training.
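The abstract refers to objectives that blend contrastive and distillation losses. As an illustrative sketch only (not the paper's exact formulation), the combined objective can be written as $\mathcal{L} = \mathcal{L}_{\text{contrastive}} + \lambda\,\mathcal{L}_{\text{distill}}$; the code below assumes an NT-Xent (SimCLR-style) contrastive term and a temperature-softened KL distillation term, and the names `nt_xent`, `kd_loss`, `ccl_objective`, and the weight `lam` are ours, not from the source.

```python
import math

def cosine(u, v):
    # Cosine similarity between two equal-length, nonzero vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def nt_xent(z1, z2, tau=0.5):
    """NT-Xent contrastive loss: z1[i] and z2[i] are embeddings of
    two augmented views of the same sample (the positive pair)."""
    z = z1 + z2
    n = len(z)
    loss = 0.0
    for i in range(n):
        j = (i + len(z1)) % n  # index of i's positive counterpart
        denom = sum(math.exp(cosine(z[i], z[k]) / tau)
                    for k in range(n) if k != i)
        pos = math.exp(cosine(z[i], z[j]) / tau)
        loss += -math.log(pos / denom)
    return loss / n

def kd_loss(student_logits, teacher_logits, T=2.0):
    """Distillation term: KL(teacher || student) on
    temperature-softened class distributions."""
    def softmax(x):
        m = max(x)
        e = [math.exp((v - m) / T) for v in x]
        s = sum(e)
        return [v / s for v in e]
    loss = 0.0
    for s_row, t_row in zip(student_logits, teacher_logits):
        p, q = softmax(t_row), softmax(s_row)
        loss += sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return loss / len(student_logits)

def ccl_objective(z1, z2, student_logits, teacher_logits, lam=1.0):
    # Blended CCL objective: contrastive term plus weighted distillation term.
    return nt_xent(z1, z2) + lam * kd_loss(student_logits, teacher_logits)
```

In a continual-learning loop, the teacher logits would typically come from a frozen snapshot of the model trained on earlier tasks, so the distillation term penalizes drift away from past knowledge while the contrastive term adapts representations to new data.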