The development of robust generative models for highly varied, non-stationary time series data is a complex yet important problem. Traditional models for time series prediction, such as Long Short-Term Memory (LSTM) networks, are inefficient and generalize poorly because they cannot capture complex temporal relationships. In this paper, we present a probabilistic generative model that can be trained to capture temporal information and that is robust to data errors. We call it the Time Deep Latent Gaussian Model (tDLGM). Its novel architecture is inspired by the Deep Latent Gaussian Model (DLGM). Our model is trained to minimize a loss function based on the negative log-likelihood. One factor contributing to tDLGM's robustness is our regularizer, which accounts for data trends. Experiments show that tDLGM is able to reconstruct and generate complex time series data, and that it is robust to noise and faulty data.