Variational Autoencoders (VAEs) are a powerful framework for learning latent representations of reduced dimensionality, while Neural ODEs excel at learning transient system dynamics. This work combines the strengths of both to generate fast surrogate models with adjustable complexity that react to time-varying input signals. By leveraging the VAE's dimensionality reduction with a non-hierarchical prior, our method adaptively assigns stochastic noise, naturally complementing known Neural ODE training enhancements and enabling probabilistic time-series modeling. We show that standard Latent ODEs struggle with dimensionality reduction in systems with time-varying inputs. Our approach mitigates this by continuously propagating variational parameters through time, establishing fixed information channels in the latent space. The result is a flexible and robust method that can accommodate different system complexities, e.g. deep neural networks or linear matrices. Thereby, it enables efficient approximation of the Koopman operator without predefining its dimensionality. As our method balances dimensionality reduction against reconstruction accuracy, we call it the Balanced Neural ODE (B-NODE). We demonstrate the effectiveness of this method on several academic and real-world test cases, e.g. a power plant and MuJoCo data.
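The core mechanism described above, propagating variational parameters continuously through time rather than sampling the latent state only at the initial condition, can be illustrated with a minimal sketch. The code below is a hypothetical toy example, not the paper's implementation: it stacks a mean and log-variance vector into one state, evolves them jointly with a simple linear latent ODE (a Koopman-style matrix, one of the system complexities mentioned above) using explicit Euler steps, and draws a stochastic latent sample at each step via the reparameterization trick. All names, the dynamics matrix, and the variance decay are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 2                      # latent dimensionality (illustrative)
A = np.array([[0.0, 1.0],  # assumed linear latent dynamics (Koopman-style matrix)
              [-1.0, -0.1]])

def ode_rhs(state):
    """Right-hand side acting on the stacked state [mu, logvar]."""
    mu, logvar = state[:d], state[d:]
    dmu = A @ mu                      # deterministic drift of the mean
    dlogvar = -0.5 * np.ones(d)       # assumed slow decay of the log-variance
    return np.concatenate([dmu, dlogvar])

def integrate(mu0, logvar0, dt=0.01, steps=500):
    """Euler-integrate the variational parameters and sample the latent state."""
    state = np.concatenate([mu0, logvar0])
    samples = []
    for _ in range(steps):
        state = state + dt * ode_rhs(state)              # explicit Euler step
        mu, logvar = state[:d], state[d:]
        eps = rng.standard_normal(d)
        samples.append(mu + np.exp(0.5 * logvar) * eps)  # reparameterization trick
    return np.stack(samples)

traj = integrate(mu0=np.array([1.0, 0.0]), logvar0=np.zeros(d))
print(traj.shape)  # one stochastic latent sample per time step
```

Because the mean and log-variance occupy fixed slots of the ODE state, each latent coordinate forms a persistent information channel over time, which is the property the abstract attributes to the B-NODE formulation.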