We investigate the descriptive complexity of a class of neural networks with unrestricted topologies and piecewise polynomial activation functions. We consider the general scenario where the networks run for an unlimited number of rounds and floating-point numbers are used to simulate reals. We characterize these neural networks with a recursive rule-based logic for Boolean networks. In particular, we show that the sizes of the neural networks and the corresponding Boolean rule formulae are polynomially related. In fact, in the direction from Boolean rules to neural networks, the blow-up is only linear. Our translations result in a time delay, i.e., the number of rounds that the translated object needs in order to simulate a single round of the original object. In the translation from neural networks to Boolean rules, the time delay of the resulting formula is polylogarithmic in the size of the neural network. In the converse translation, the time delay of the neural network is linear in the formula size. As a corollary, by restricting our logic, we obtain an analogous characterization for classical feedforward neural networks. We also obtain translations between the rule-based logic for Boolean networks, the diamond-free fragment of modal substitution calculus, and a class of recursive Boolean circuits where the numbers of input and output gates match. Ultimately, our translations offer a method of transforming a given neural network into an equivalent neural network with different activation functions, including linear activation functions.
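As a minimal illustration (not from the paper) of the computation model described above, the following sketch runs a recurrent neural network with an unrestricted topology for a number of synchronous rounds over ordinary floating-point numbers, using a piecewise polynomial activation (here ReLU); all names and the example network are hypothetical.

```python
def relu(x):
    # ReLU is piecewise polynomial: 0 on (-inf, 0], the identity on [0, inf).
    return x if x > 0.0 else 0.0

def run_rounds(weights, biases, state, rounds):
    """Run the network for a fixed number of synchronous rounds.

    weights[i][j] is the weight of the edge from node j to node i; any
    topology is allowed, including cycles and self-loops, so the network
    need not be feedforward.
    """
    n = len(state)
    for _ in range(rounds):
        state = [
            relu(sum(weights[i][j] * state[j] for j in range(n)) + biases[i])
            for i in range(n)
        ]
    return state

# Two nodes feeding each other in a cycle: a topology that a classical
# feedforward network cannot express directly.
w = [[0.0, 0.5],
     [0.5, 0.0]]
b = [0.0, 0.0]
print(run_rounds(w, b, [1.0, 1.0], rounds=3))  # each round halves both states
```

In this toy network each round halves both states, so three rounds map `[1.0, 1.0]` to `[0.125, 0.125]`; a feedforward network corresponds to the special case where the topology is acyclic and the number of rounds is bounded by the depth.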