We study online conformal prediction for non-stationary data streams subject to unknown distribution drift. While most prior work has studied this problem under adversarial settings and/or assessed performance in terms of gaps in time-averaged marginal coverage, we instead evaluate performance through training-conditional cumulative regret. We specifically focus on independently generated data with two types of distribution shift: abrupt change points and smooth drift. When non-conformity score functions are pretrained on an independent dataset, we propose a split-conformal style algorithm that leverages drift detection to adaptively update calibration sets, and which provably achieves minimax-optimal regret. When non-conformity scores are instead trained online, we develop a full-conformal style algorithm that again incorporates drift detection to handle non-stationarity; this approach relies on stability, rather than permutation symmetry, of the model-fitting algorithm, which is often better suited to online learning under evolving environments. We establish non-asymptotic regret guarantees for our online full conformal algorithm, which match the minimax lower bound under appropriate restrictions on the prediction sets. Numerical experiments corroborate our theoretical findings.
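To make the split-conformal idea concrete, the following is a toy sketch (not the paper's algorithm): non-conformity scores arrive in a stream with an abrupt change point, prediction sets are calibrated from a running calibration set via the standard finite-sample quantile, and a crude mean-shift test on a recent window triggers a reset of the calibration set when drift is suspected. The score model, drift test, window size, and thresholds are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def conformal_quantile(scores, alpha):
    # (1 - alpha) empirical quantile with the standard (n + 1) finite-sample correction
    n = len(scores)
    k = int(np.ceil((n + 1) * (1 - alpha)))
    return np.sort(scores)[min(k, n) - 1]

# Synthetic score stream with an abrupt change point at t = 500:
# the residual scale doubles, so stale calibration scores under-cover.
T, alpha, window = 1000, 0.1, 50
scores = np.abs(rng.normal(0.0, 1.0, T))
scores[500:] = np.abs(rng.normal(0.0, 2.0, T - 500))

cal = []        # running calibration set of past non-conformity scores
covered = []    # whether each new score fell inside the prediction set
for t in range(T):
    if len(cal) >= 20:
        q = conformal_quantile(np.array(cal), alpha)
        covered.append(scores[t] <= q)
    # Crude drift check (illustrative): compare the recent window's mean
    # score against the calibration set's mean; reset on a large gap.
    if len(cal) >= window and t >= window:
        recent = scores[t - window:t]
        gap = abs(recent.mean() - np.mean(cal))
        if gap > 2 * np.std(cal) / np.sqrt(window):
            cal = []  # drift suspected: discard stale calibration scores
    cal.append(scores[t])

print(f"empirical coverage: {np.mean(covered):.3f}")
```

Without the reset, the calibration quantile computed from pre-change scores would systematically under-cover after t = 500; resetting trades a short burst of re-calibration for restored coverage, which is the trade-off the regret analysis quantifies.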