The primary aim of knowledge graph embedding (KGE) is to learn low-dimensional representations of entities and relations for predicting missing facts. While rotation-based methods such as RotatE and QuatE perform well in KGE, they face two challenges: limited flexibility, since the number of relation parameters must grow in proportion to the entity dimension, and difficulty in generalizing to higher-dimensional rotations. To address these issues, we introduce OrthogonalE, a novel KGE model that represents entities as matrices and relations as block-diagonal orthogonal matrices optimized with Riemannian optimization. This approach enhances the generality and flexibility of KGE models. Experimental results show that OrthogonalE is both general and flexible, significantly outperforming state-of-the-art KGE models while substantially reducing the number of relation parameters.
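The parameter saving described above can be illustrated with a minimal sketch: a block-diagonal orthogonal matrix built from 2x2 rotation blocks needs only one angle per block, so a (2n x 2n) relation matrix uses n parameters instead of (2n)^2, while still preserving norms when applied to an entity matrix. This is a hypothetical illustration of the general idea, not the paper's actual implementation (which uses Riemannian optimization on the orthogonal manifold); the function and variable names below are invented for the example.

```python
import numpy as np

def block_diag_orthogonal(angles):
    """Build a block-diagonal orthogonal matrix from 2x2 rotation blocks.

    Each angle parameterises one 2x2 rotation block, so a (2n x 2n)
    orthogonal matrix is described by only n parameters.
    """
    n = len(angles)
    R = np.zeros((2 * n, 2 * n))
    for i, t in enumerate(angles):
        c, s = np.cos(t), np.sin(t)
        R[2 * i:2 * i + 2, 2 * i:2 * i + 2] = [[c, -s], [s, c]]
    return R

# Hypothetical example: a relation acting on an entity represented as a matrix.
rng = np.random.default_rng(0)
angles = rng.uniform(0.0, 2.0 * np.pi, size=4)  # 4 parameters for an 8x8 relation
R = block_diag_orthogonal(angles)
E = rng.normal(size=(8, 3))                     # entity as an 8x3 matrix
transformed = R @ E

# Orthogonality check: R^T R = I, so the transformation preserves norms.
assert np.allclose(R.T @ R, np.eye(8))
assert np.allclose(np.linalg.norm(transformed), np.linalg.norm(E))
```

Because each block is a plane rotation, the full matrix is orthogonal by construction, which is what makes the norm-preservation check above hold.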