Recent works have demonstrated promising performance of neural networks on hyperbolic spaces and symmetric positive definite (SPD) manifolds. These spaces belong to a family of Riemannian manifolds referred to as symmetric spaces of noncompact type. In this paper, we propose a novel approach for developing neural networks on such spaces. Our approach relies on a unified formulation of the distance from a point to a hyperplane on the considered spaces. We show that some existing formulations of the point-to-hyperplane distance can be recovered by our approach under specific settings. Furthermore, we derive a closed-form expression for the point-to-hyperplane distance in higher-rank symmetric spaces of noncompact type equipped with G-invariant Riemannian metrics. The derived distance then serves as a tool to design fully-connected (FC) layers and an attention mechanism for neural networks on the considered spaces. Our approach is validated on challenging benchmarks for image classification, electroencephalogram (EEG) signal classification, image generation, and natural language inference.