Gaussian process state-space models (GPSSMs) have emerged as a powerful framework for modeling dynamical systems, offering interpretable uncertainty quantification and inherent regularization. However, existing GPSSMs face significant challenges in handling high-dimensional, non-stationary systems due to computational inefficiencies, limited scalability, and restrictive stationarity assumptions. In this paper, we propose an efficient transformed Gaussian process state-space model (ETGPSSM) to address these limitations. Our approach leverages a single shared Gaussian process (GP) combined with normalizing flows and Bayesian neural networks, enabling efficient modeling of complex, high-dimensional state transitions while preserving scalability. To address the lack of closed-form expressions for the implicit process in the transformed GP, we follow its generative process and introduce an efficient variational inference algorithm, aided by the ensemble Kalman filter (EnKF), to enable computationally tractable learning and inference. Extensive empirical evaluations on synthetic and real-world datasets demonstrate the superior performance of our ETGPSSM in system dynamics learning, high-dimensional state estimation, and time-series forecasting, outperforming existing GPSSMs and neural network-based methods in both accuracy and computational efficiency.
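The ensemble Kalman filter mentioned above can be illustrated with a minimal, generic stochastic EnKF analysis step (a standard textbook update with perturbed observations, not the paper's exact implementation; the function and variable names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(ensemble, y, H, R):
    """One stochastic EnKF analysis step with perturbed observations.

    ensemble : (N, d) array of prior state samples
    y        : (m,) observed vector
    H        : (m, d) linear observation operator
    R        : (m, m) observation-noise covariance
    """
    N, _ = ensemble.shape
    X = ensemble - ensemble.mean(axis=0)        # state anomalies
    HX = ensemble @ H.T                         # predicted observations, (N, m)
    Y = HX - HX.mean(axis=0)                    # observation anomalies
    # Ensemble estimates of the covariances entering the Kalman gain
    Pyy = Y.T @ Y / (N - 1) + R
    Pxy = X.T @ Y / (N - 1)
    K = Pxy @ np.linalg.inv(Pyy)                # Kalman gain, (d, m)
    # Perturbing the observations keeps the analysis-ensemble spread consistent
    y_pert = y + rng.multivariate_normal(np.zeros(len(y)), R, size=N)
    return ensemble + (y_pert - HX) @ K.T

# Toy example: 2-D state, only the first component is observed
N = 500
prior = rng.normal([1.0, -1.0], 1.0, size=(N, 2))
H = np.array([[1.0, 0.0]])
R = np.array([[0.1]])
posterior = enkf_update(prior, np.array([2.0]), H, R)
```

In a GPSSM setting, the ensemble members would be samples of the latent state propagated through the (transformed-GP) transition function; the same analysis step then conditions them on each new observation without requiring closed-form filtering densities.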