Tipping points, at which a system shifts suddenly from one state to another, occur in many real-world systems. Predicting the occurrence of tipping points from time series data remains an outstanding challenge and a major interest across a broad range of research fields. In particular, the widely used methods based on bifurcation theory are neither reliable in prediction accuracy nor applicable to irregularly sampled time series, which are common in observations of real-world systems. Here we address this challenge by developing a deep learning algorithm that predicts the occurrence of tipping points in untrained systems by exploiting information about normal forms. Our algorithm not only outperforms traditional methods on regularly sampled model time series but also achieves accurate predictions for irregularly sampled model time series and empirical time series. The ability to predict tipping points in complex systems paves the way for mitigating risks, preventing catastrophic failures, and restoring degraded systems, with broad applications in social science, engineering, and biology.
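The bifurcation-based baselines the abstract alludes to are typically early-warning signals derived from critical slowing down: as a system approaches a bifurcation-induced tipping point, its recovery from perturbations slows, so lag-1 autocorrelation and variance of the time series rise. The sketch below is a minimal illustration of these classical indicators on a synthetic series, not the deep learning method described here; the AR(1) process with a ramped memory parameter `phi` is an assumed stand-in for a system drifting toward a tipping point.

```python
import numpy as np

def lag1_autocorr(window):
    # Pearson correlation between the series and its one-step lag.
    x = window[:-1] - window[:-1].mean()
    y = window[1:] - window[1:].mean()
    denom = np.sqrt((x * x).sum() * (y * y).sum())
    return (x * y).sum() / denom if denom > 0 else 0.0

def rolling_ews(series, win):
    # Rolling lag-1 autocorrelation and variance: the classical
    # early-warning signals of critical slowing down.
    ac = np.array([lag1_autocorr(series[i - win:i])
                   for i in range(win, len(series) + 1)])
    var = np.array([series[i - win:i].var()
                    for i in range(win, len(series) + 1)])
    return ac, var

# Synthetic (assumed) example: an AR(1) process whose memory phi ramps
# toward 1, mimicking critical slowing down before a tipping point.
rng = np.random.default_rng(0)
n = 2000
phi = np.linspace(0.2, 0.95, n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi[t] * x[t - 1] + rng.normal(scale=0.1)

ac, var = rolling_ews(x, win=200)
print(f"lag-1 autocorrelation: start {ac[0]:.2f}, end {ac[-1]:.2f}")
print(f"variance: start {var[0]:.4f}, end {var[-1]:.4f}")
```

Both indicators trend upward as `phi` approaches 1, which is the signature such methods look for. Their known weaknesses, unreliable thresholds and the assumption of regular sampling, are exactly what motivates the learning-based approach of this work.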