We introduce liquid-resistance liquid-capacitance neural networks (LRCs), a neural-ODE model that considerably improves the smoothness, accuracy, and biological plausibility of electrical equivalent circuits (EECs), liquid time-constant networks (LTCs), and saturated liquid time-constant networks (STCs), respectively. We also introduce LRC units (LRCUs), a highly efficient and accurate gated-RNN model obtained by solving LRCs with an explicit Euler scheme using just one unfolding. We empirically show and formally prove that the liquid capacitance of LRCs considerably dampens the oscillations of LTCs and STCs, while at the same time dramatically increasing accuracy, even for cheap solvers. Finally, we experimentally demonstrate that LRCs are a highly competitive alternative to popular neural ODEs and gated RNNs in terms of accuracy, efficiency, and interpretability, on classic time-series benchmarks and a complex autonomous-driving lane-keeping task.
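The abstract's construction of LRCUs, a single explicit-Euler unfolding of the underlying ODE, can be sketched as follows. This is a minimal illustration of one forward-Euler step of generic liquid time-constant-style dynamics; the function name `lrcu_step`, the parameter names, and the exact equation are assumptions for illustration, not the paper's formulation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lrcu_step(x, u, params, dt=1.0):
    """One explicit-Euler step ("one unfolding") of LTC-style dynamics.

    x: hidden state, u: input at this step. All names and the exact
    form of the dynamics are illustrative assumptions, not the paper's
    LRC equations.
    """
    W, U, b, A, tau = params
    # Input- and state-dependent "liquid" gate controlling the time constant.
    g = sigmoid(x @ W + u @ U + b)
    # LTC-style dynamics: leak toward 0 with liquid rate, drive toward A.
    dxdt = -(1.0 / tau + g) * x + g * A
    # A gated RNN cell is obtained by taking just this one Euler step.
    return x + dt * dxdt
```

Applied recurrently over a sequence of inputs, this single-step update behaves like a gated RNN cell while remaining an explicit discretization of the continuous-time model.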