Looped Transformers have shown exceptional neural algorithmic reasoning capability in simulating traditional graph algorithms, but their application to more complex structures such as hypergraphs remains underexplored. Hypergraphs generalize graphs by modeling higher-order relationships among multiple entities, enabling richer representations but introducing significant computational challenges. In this work, we extend the Looped Transformer architecture's neural algorithmic reasoning capability to simulate hypergraph algorithms, addressing the gap between neural networks and combinatorial optimization over hypergraphs. Specifically, we propose a novel degradation mechanism that reduces hypergraphs to graph representations, enabling the simulation of graph-based algorithms such as Dijkstra's shortest-path algorithm. Furthermore, we introduce a hyperedge-aware encoding scheme to simulate hypergraph-specific algorithms, exemplified by Helly's algorithm. We establish theoretical guarantees for these simulations, demonstrating the feasibility of processing high-dimensional and combinatorial data with Looped Transformers. This work highlights the potential of Transformers as general-purpose algorithmic solvers for structured data.