Many numerical algorithms in scientific computing -- particularly in areas such as numerical linear algebra, PDE simulation, and inverse problems -- produce outputs that can be represented by semialgebraic functions; that is, the graph of the computed function can be described by finitely many polynomial equalities and inequalities. In this work, we introduce Semialgebraic Neural Networks (SANNs), a neural network architecture capable of representing any bounded semialgebraic function and of computing such functions to the accuracy of a numerical ODE solver chosen by the programmer. Conceptually, we encode the graph of the learned function as the kernel of a piecewise polynomial selected from a class of functions whose roots can be evaluated using a particular homotopy continuation method. We show by construction that the SANN architecture is able to execute this continuation method, thus evaluating the learned semialgebraic function. Furthermore, the architecture can exactly represent even discontinuous semialgebraic functions by executing a continuation method on each connected component of the target function. Lastly, we provide example applications of these networks and show that they can be trained with traditional deep-learning techniques.
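To make the core idea concrete, the sketch below illustrates homotopy continuation in its simplest scalar form: a root of a target polynomial is found by deforming from an easy start system and tracking the root along the homotopy with an ODE solver (Davidenko's equation), followed by a few Newton corrections. This is only an illustrative toy, not the SANN construction itself; all names (`f`, `df`, `track_root`) are hypothetical.

```python
def f(x):
    # Target polynomial: x^3 - 2x - 5, with a real root near 2.0945515.
    return x**3 - 2*x - 5

def df(x):
    return 3*x**2 - 2

def track_root(x0, steps=200):
    """Track the root of H(x, t) = (1 - t)(x - x0) + t*f(x) from t=0 to t=1.

    Differentiating H(x(t), t) = 0 gives Davidenko's equation
        dx/dt = -(dH/dt) / (dH/dx),
    which we integrate with a fixed-step RK4 scheme.
    """
    def rhs(t, x):
        dH_dt = f(x) - (x - x0)       # partial of H with respect to t
        dH_dx = (1 - t) + t * df(x)   # partial of H with respect to x
        return -dH_dt / dH_dx

    h = 1.0 / steps
    t, x = 0.0, x0
    for _ in range(steps):
        k1 = rhs(t, x)
        k2 = rhs(t + h/2, x + h*k1/2)
        k3 = rhs(t + h/2, x + h*k2/2)
        k4 = rhs(t + h, x + h*k3)
        x += h * (k1 + 2*k2 + 2*k3 + k4) / 6
        t += h

    # A few Newton corrections sharpen the tracked root to high accuracy.
    for _ in range(5):
        x -= f(x) / df(x)
    return x

root = track_root(x0=1.0)
print(root)
```

The accuracy of the tracked root is governed by the ODE solver (step count and order) plus the terminal Newton polish, mirroring the abstract's point that SANN evaluation accuracy is set by the programmer's choice of numerical ODE solver.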