Conformal prediction offers a powerful framework for building distribution-free prediction intervals for exchangeable data. Existing methods that extend conformal prediction to sequential data rely on fitting a relatively complex model to capture temporal dependencies. However, these methods can fail if the sample size is small and often require expensive retraining when the underlying data distribution changes. To overcome these limitations, we propose Reservoir Conformal Prediction (ResCP), a novel training-free conformal prediction method for time series. Our approach leverages the efficiency and representation learning capabilities of reservoir computing to dynamically reweight conformity scores. In particular, we compute similarity scores among reservoir states and use them to adaptively reweight the observed residuals at each step. With this approach, ResCP enables us to account for local temporal dynamics when modeling the error distribution without compromising computational scalability. We prove that, under reasonable assumptions, ResCP achieves asymptotic conditional coverage, and we empirically demonstrate its effectiveness across diverse forecasting tasks.
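The pipeline sketched in the abstract (an untrained reservoir produces states, state similarity reweights calibration residuals, and a weighted quantile yields the interval) can be illustrated as follows. This is a minimal sketch, not the paper's implementation: the reservoir size `D_h`, leak rate `leak`, RBF bandwidth `gamma`, the persistence forecaster, and the specific similarity kernel are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical hyperparameters (illustrative, not from the paper)
D_h, leak, alpha = 50, 0.5, 0.1

# Fixed random reservoir (echo state network style), scaled to spectral radius 0.9
W = rng.normal(size=(D_h, D_h))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
W_in = rng.normal(size=D_h)

def reservoir_states(x):
    """Run the untrained leaky reservoir over a 1-D series; one state per step."""
    h = np.zeros(D_h)
    states = []
    for x_t in x:
        h = (1 - leak) * h + leak * np.tanh(W @ h + W_in * x_t)
        states.append(h.copy())
    return np.array(states)

def weighted_quantile(scores, weights, q):
    """q-quantile of `scores` under the normalized `weights`."""
    order = np.argsort(scores)
    s, w = scores[order], weights[order]
    cdf = np.cumsum(w) / w.sum()
    return s[np.searchsorted(cdf, q)]

# Toy data: noisy sine wave with a naive one-step persistence forecast
x = np.sin(np.linspace(0, 20, 400)) + 0.1 * rng.normal(size=400)
preds = x[:-1]                   # forecast x_{t+1} ~ x_t
resid = np.abs(x[1:] - preds)    # conformity scores (absolute residuals)
H = reservoir_states(x[:-1])     # reservoir state at each forecast step

# Weight past residuals by RBF similarity between their reservoir states
# and the current (test-time) state, then take the weighted quantile.
h_test = H[-1]
gamma = 1.0  # hypothetical bandwidth
sims = np.exp(-gamma * np.sum((H[:-1] - h_test) ** 2, axis=1))
q_hat = weighted_quantile(resid[:-1], sims, 1 - alpha)
interval = (preds[-1] - q_hat, preds[-1] + q_hat)
```

No model is trained at any point: the reservoir weights are fixed and random, which is what makes the reweighting step cheap and retraining-free when the distribution drifts.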