In dynamical systems reconstruction (DSR), we seek to infer from time series measurements a generative model of the underlying dynamical process. This is a prime objective in any scientific discipline, where we are particularly interested in parsimonious models with a low parameter load. A common strategy here is parameter pruning, removing all parameters with small weights. However, we find that this strategy does not work for DSR, where even low-magnitude parameters can contribute considerably to the system dynamics. On the other hand, it is well known that many natural systems which generate complex dynamics, like the brain or ecological networks, have a sparse topology with comparatively few links. Inspired by this, we show that geometric pruning, where, in contrast to magnitude-based pruning, weights with a low contribution to an attractor's geometrical structure are removed, indeed reduces parameter load substantially without significantly hampering DSR quality. We further find that the networks resulting from geometric pruning have a specific type of topology, and that this topology, rather than the magnitude of weights, is what is most crucial to performance. We provide an algorithm that automatically generates such topologies, which can be used as priors for generative modeling of dynamical systems by RNNs, and compare it to other well-studied topologies like small-world or scale-free networks.
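The contrast between magnitude-based and geometric pruning can be sketched as follows. This is a minimal illustrative implementation, not the paper's actual algorithm: the autonomous RNN, the initial condition, and the crude geometric discrepancy measure (difference in trajectory mean and covariance, standing in for a proper attractor-geometry comparison) are all simplifying assumptions. Instead of removing the smallest weights, each candidate weight is zeroed in turn, and the weights whose removal perturbs the generated trajectory's geometry the least are pruned first.

```python
import numpy as np

def simulate(W, x0, steps=500):
    # Autonomous vanilla RNN: x_{t+1} = tanh(W x_t)
    x = x0.copy()
    traj = []
    for _ in range(steps):
        x = np.tanh(W @ x)
        traj.append(x.copy())
    return np.array(traj)

def geometry_score(ref, traj):
    # Crude stand-in for attractor-geometry discrepancy:
    # compare first and second moments of the two trajectories.
    return (np.linalg.norm(ref.mean(axis=0) - traj.mean(axis=0))
            + np.linalg.norm(np.cov(ref.T) - np.cov(traj.T)))

def geometric_prune(W, x0, frac=0.2, burn=100):
    # Reference trajectory of the unpruned network (after a burn-in).
    ref = simulate(W, x0)[burn:]
    idx = np.argwhere(W != 0)
    impact = []
    for i, j in idx:
        W_test = W.copy()
        W_test[i, j] = 0.0  # zero one weight and re-simulate
        impact.append(geometry_score(ref, simulate(W_test, x0)[burn:]))
    # Prune the fraction of weights with the LOWEST geometric impact,
    # regardless of their magnitude.
    order = np.argsort(impact)
    W_pruned = W.copy()
    for k in order[: int(frac * len(order))]:
        i, j = idx[k]
        W_pruned[i, j] = 0.0
    return W_pruned

rng = np.random.default_rng(0)
W = rng.normal(0.0, 0.9, size=(8, 8))
x0 = rng.normal(0.0, 1.0, size=8)
W_pruned = geometric_prune(W, x0, frac=0.25)
```

Note that a pruned weight here may well be large in magnitude; the ranking criterion is purely the effect its removal has on the generated geometry, which is the key departure from standard magnitude pruning.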