Data are often sampled irregularly in time. Handling this with Recurrent Neural Networks (RNNs) has traditionally meant ignoring the irregularity, feeding the time differences as additional inputs, or resampling the data. All of these methods have shortcomings. We propose an elegant, straightforward alternative: the RNN itself is in effect resampled in time to match the timing of the data or the task at hand. We use the Echo State Network (ESN) and the Gated Recurrent Unit (GRU) as the basis for our solution. Such RNNs can be seen as discretizations of continuous-time dynamical systems, which gives our approach a solid theoretical grounding. Our Task-Synchronized ESN (TSESN) and GRU (TSGRU) models allow the model time to be set directly and require no additional training, parameter tuning, or computation (solving differential equations or interpolating data) compared to their regular counterparts, thus retaining their original efficiency. We confirm empirically that our models can effectively compensate for the time-non-uniformity of the data, and we demonstrate on several real-world nonuniform-time datasets that they compare favorably to data resampling, classical RNN methods, and alternative RNN models proposed to deal with time irregularities. We open-source the code at https://github.com/oshapio/task-synchronized-RNNs.
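To make the idea of "resampling the RNN in time" concrete, here is a minimal sketch of a time-aware leaky-integrator ESN update, where the leak applied at each step is scaled by the (possibly irregular) time gap `dt` between consecutive samples. The names (`alpha`, `W`, `W_in`), the spectral-radius scaling, and the exact form of the `dt` scaling are illustrative assumptions, not the paper's precise formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_res, n_in = 50, 1

# Random reservoir, rescaled so its spectral radius is below 1 (echo state property).
W = rng.standard_normal((n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
W_in = rng.standard_normal((n_res, n_in))
alpha = 0.5  # base leak rate per unit of time (assumed hyperparameter)

def step(x, u, dt):
    """One reservoir update; dt rescales the leak to the actual time gap,
    so larger gaps let the state move further toward the new activation."""
    leak = min(alpha * dt, 1.0)  # clamp so the update stays a convex mix
    x_new = np.tanh(W @ x + W_in @ u)
    return (1.0 - leak) * x + leak * x_new

# Drive the reservoir with irregularly timed inputs.
times = np.array([0.0, 0.3, 1.1, 1.4, 2.9])
inputs = rng.standard_normal((len(times), n_in))
x = np.zeros(n_res)
for i in range(1, len(times)):
    x = step(x, inputs[i], times[i] - times[i - 1])
```

Because the state remains a convex combination of the previous state and a `tanh` activation, it stays bounded regardless of how uneven the time gaps are; this is the sense in which the model's time is synchronized to the data's time without any extra training or ODE solving.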