A fully tensorial theory of hypercomplex neural networks is presented, allowing networks to use arithmetic based on arbitrary algebras. The key observation is that algebra multiplication can be represented as a rank-three tensor, and this tensor can then be used in every algebraic operation. The approach is attractive for neural network libraries that support efficient tensorial operations. It agrees with previous implementations for four-dimensional algebras. A proof of the Universal Approximation Theorem in the tensor formalism is given.
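The core idea can be sketched briefly. Writing the product of basis elements as $e_i e_j = \sum_k T_{ijk}\, e_k$, the rank-three structure tensor $T$ encodes the whole multiplication table, and products of algebra elements reduce to a single tensor contraction. The snippet below is a minimal illustration using the quaternions as an assumed example algebra (the tensor `T`, the function name `algebra_mul`, and the basis ordering are illustrative choices, not taken from the source):

```python
import numpy as np

# Structure tensor T of the quaternion algebra, basis (1, i, j, k):
# e_i * e_j = sum_k T[i, j, k] e_k.
T = np.zeros((4, 4, 4))
T[0, 0, 0] = 1.0                      # 1 * 1 = 1
for m in (1, 2, 3):
    T[0, m, m] = T[m, 0, m] = 1.0     # 1 * e = e * 1 = e
    T[m, m, 0] = -1.0                 # i^2 = j^2 = k^2 = -1
T[1, 2, 3] = T[2, 3, 1] = T[3, 1, 2] = 1.0   # ij = k, jk = i, ki = j
T[2, 1, 3] = T[3, 2, 1] = T[1, 3, 2] = -1.0  # ji = -k, kj = -i, ik = -j

def algebra_mul(a, b):
    """Multiply coefficient vectors a, b by contracting with T."""
    return np.einsum('i,j,ijk->k', a, b, T)

i = np.array([0., 1., 0., 0.])
j = np.array([0., 0., 1., 0.])
print(algebra_mul(i, j))  # i * j = k  ->  [0. 0. 0. 1.]
```

Replacing `T` with the structure tensor of any other algebra changes the arithmetic without touching the rest of the code, which is what makes the formulation convenient for tensor-based neural network libraries.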