Complex networks abstract many real-world systems, and deciphering the information they encode remains a persistent challenge across disciplines. Recently, hyperbolic geometry of latent spaces has gained traction in network analysis because it preserves certain local intrinsic properties of the nodes. In this study, we examine the problem from a much broader perspective: understanding how nodes' global topological structures shape their placement in the latent space. Our investigations reveal a direct correlation between a node's topological structure and its position in the latent space. Building on this strong connection between node distance and network topology, we propose a novel embedding framework, Topology-encoded Latent Hyperbolic Geometry (TopoLa), for analyzing complex networks. With topological information encoded in the latent space, TopoLa enhances both conventional and low-rank networks, and we use the singular value gap to clarify the mathematical principles behind this enhancement. We further show that the resulting TopoLa distance can augment pivotal deep learning paradigms, including knowledge distillation and contrastive learning.
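As a concrete illustration of the latent hyperbolic geometry referred to above, the sketch below computes pairwise distance in the Poincaré-disk model, the most common latent-space model in hyperbolic network embedding. The model choice and function name are illustrative assumptions; the paper's actual TopoLa distance may be defined differently.

```python
import math

def poincare_distance(u, v):
    """Hyperbolic distance between two points inside the unit Poincaré ball.

    d(u, v) = arccosh(1 + 2 * ||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2)))
    Both points must lie strictly inside the unit ball (||u|| < 1, ||v|| < 1).
    NOTE: this is a generic textbook formula, not the paper's TopoLa distance.
    """
    diff_sq = sum((a - b) ** 2 for a, b in zip(u, v))
    norm_u_sq = sum(a * a for a in u)
    norm_v_sq = sum(b * b for b in v)
    return math.acosh(1.0 + 2.0 * diff_sq / ((1.0 - norm_u_sq) * (1.0 - norm_v_sq)))

# Distances blow up near the boundary of the disk, which is what lets
# hyperbolic embeddings separate hierarchical/tree-like neighborhoods:
# moving the same Euclidean step near the rim costs far more hyperbolic distance.
```

For example, `poincare_distance((0.0, 0.0), (0.5, 0.0))` evaluates to `arccosh(5/3) = ln 3 ≈ 1.0986`, more than twice the Euclidean distance of 0.5 between the same points.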