Recent advances in knowledge graph embedding (KGE) rely on Euclidean/hyperbolic orthogonal relation transformations to model intrinsic logical patterns and topological structures. However, existing approaches are confined to rigid relational orthogonalization with restricted dimension and homogeneous geometry, limiting their modeling capability. In this work, we move beyond these approaches in both dimension and geometry by introducing a powerful framework named GoldE, which features a universal orthogonal parameterization based on a generalized form of the Householder reflection. This parameterization naturally achieves dimensional extension and geometric unification with theoretical guarantees, enabling our framework to simultaneously capture crucial logical patterns and the inherent topological heterogeneity of knowledge graphs. Empirically, GoldE achieves state-of-the-art performance on three standard benchmarks. Code is available at https://github.com/xxrep/GoldE.
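To make the building block concrete: a standard Householder reflection maps a vector v to the orthogonal matrix H = I - 2vv^T/(v^T v), and compositions of such reflections span the orthogonal group. The sketch below illustrates only this classical construction in NumPy; it is not the paper's generalized parameterization, and the function name `householder` is our own illustrative choice.

```python
import numpy as np

def householder(v: np.ndarray) -> np.ndarray:
    """Classical Householder reflection H = I - 2 vv^T / (v^T v).

    H reflects vectors across the hyperplane orthogonal to v, and is
    both orthogonal (H^T H = I) and involutory (H H = I).
    """
    v = np.asarray(v, dtype=float)
    return np.eye(len(v)) - 2.0 * np.outer(v, v) / (v @ v)

# Verify orthogonality for an arbitrary nonzero vector.
v = np.array([1.0, 2.0, 3.0])
H = householder(v)
assert np.allclose(H.T @ H, np.eye(3))  # orthogonal
assert np.allclose(H @ H, np.eye(3))    # involutory

# A product of several reflections is still orthogonal; relation-specific
# products of this kind are the kind of transformation KGE methods learn.
Q = householder([1.0, 0.0, 1.0]) @ householder([0.0, 1.0, -1.0])
assert np.allclose(Q.T @ Q, np.eye(3))
```

In practice, parameterizing relation matrices as products of learnable reflection vectors keeps the transformation exactly orthogonal during gradient training, which is what motivates Householder-style constructions in this line of work.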