The development of robust generative models for highly varied, non-stationary time series data is a complex yet important problem. Traditional models for time series prediction, such as Long Short-Term Memory (LSTM) networks, are inefficient and generalize poorly because they cannot capture complex temporal relationships. In this paper, we present a probabilistic generative model that can be trained to capture temporal information and that is robust to data errors. We call it the Time Deep Latent Gaussian Model (tDLGM). Its novel architecture is inspired by the Deep Latent Gaussian Model (DLGM). Our model is trained to minimize a loss function based on the negative log-likelihood. One factor contributing to tDLGM's robustness is our regularizer, which accounts for data trends. Experiments show that tDLGM is able to reconstruct and generate complex time series data, and that it is robust to noise and faulty data.
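The abstract mentions a negative log-likelihood loss combined with a trend-aware regularizer. The paper's exact definitions are not given here, so the following NumPy sketch is only an illustrative reconstruction: the Gaussian output distribution, the first-difference trend penalty, and the weight `lam` are all assumptions, not the authors' formulation.

```python
# Hedged sketch of an NLL-plus-trend-regularizer loss.
# The trend penalty (first differences) and `lam` are illustrative
# assumptions; the paper's actual regularizer may differ.
import numpy as np

def gaussian_nll(x, mu, sigma):
    """Negative log-likelihood of x under a diagonal Gaussian N(mu, sigma^2)."""
    return 0.5 * np.sum(np.log(2 * np.pi * sigma**2) + (x - mu)**2 / sigma**2)

def trend_regularizer(x_hat, x):
    """Penalize mismatch between the first differences (local trends)
    of the reconstruction and the data."""
    return np.sum((np.diff(x_hat) - np.diff(x))**2)

def tdlgm_style_loss(x, mu, sigma, lam=0.1):
    """Total loss: model NLL plus the weighted trend penalty."""
    return gaussian_nll(x, mu, sigma) + lam * trend_regularizer(mu, x)

# Toy usage: a noisy sine wave and a smooth reconstruction of it.
t = np.linspace(0, 2 * np.pi, 50)
x = np.sin(t) + 0.05 * np.random.default_rng(0).normal(size=50)
mu = np.sin(t)                    # model mean (reconstruction)
sigma = np.full(50, 0.1)          # model standard deviation
loss = tdlgm_style_loss(x, mu, sigma)
```

The trend term pushes the model to match how the signal moves between time steps rather than only its pointwise values, which is one plausible way a regularizer could "account for data trends."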