Existing efforts have been dedicated to designing various topologies and graph-aware strategies for graph Transformers, greatly improving their representation capabilities. However, manually determining a suitable Transformer architecture for a specific graph dataset or task requires extensive expert knowledge and laborious trial and error. This paper proposes an evolutionary graph Transformer architecture search framework (EGTAS) to automate the construction of strong graph Transformers. We build a comprehensive graph Transformer search space comprising both macro-level and micro-level designs: EGTAS evolves graph Transformer topologies at the macro level and graph-aware strategies at the micro level. Furthermore, we propose a surrogate model based on a generic architectural encoding that directly predicts the performance of graph Transformers, substantially reducing the evaluation cost of the evolutionary search. We demonstrate the efficacy of EGTAS across a range of graph-level and node-level tasks, encompassing both small-scale and large-scale graph datasets. Experimental results and ablation studies show that EGTAS constructs high-performance architectures that rival state-of-the-art manually designed and automated baselines.
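To make the surrogate-assisted evolutionary search concrete, the following is a minimal toy sketch. The search-space entries, the encoding, and the scoring function are all illustrative assumptions, not EGTAS's actual design: architectures are encoded as index vectors over macro/micro choices, a cheap 1-nearest-neighbour surrogate ranks mutated candidates, and only the surrogate's top picks receive (here, simulated) true evaluation.

```python
import random

# Hypothetical macro/micro search space (names are illustrative only).
SEARCH_SPACE = {
    "layers":       [4, 6, 8],                          # macro: depth
    "attention":    ["vanilla", "sparse"],              # macro: topology
    "pos_encoding": ["laplacian", "random_walk", "none"],  # micro: graph-aware
    "gnn_block":    ["gcn", "gin", "none"],             # micro: graph-aware
}
KEYS = list(SEARCH_SPACE)

def random_arch(rng):
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def encode(arch):
    # Generic architectural encoding: index of each chosen option.
    return [SEARCH_SPACE[k].index(arch[k]) for k in KEYS]

def true_eval(arch):
    # Stand-in for costly training/validation; a toy score of the encoding.
    return sum((c + 1) * (i + 1) for i, c in enumerate(encode(arch)))

def surrogate_predict(code, archive):
    # 1-NN surrogate over previously evaluated (encoding, score) records.
    dist = lambda a, b: sum(abs(x - y) for x, y in zip(a, b))
    return min(archive, key=lambda rec: dist(rec[0], code))[1]

def mutate(arch, rng):
    # Resample one macro- or micro-level choice.
    child = dict(arch)
    k = rng.choice(KEYS)
    child[k] = rng.choice(SEARCH_SPACE[k])
    return child

def surrogate_assisted_search(generations=10, pop_size=6, seed=0):
    rng = random.Random(seed)
    pop = [random_arch(rng) for _ in range(pop_size)]
    archive = [(encode(a), true_eval(a)) for a in pop]
    for _ in range(generations):
        # Cheaply rank many mutated candidates with the surrogate...
        cands = [mutate(rng.choice(pop), rng) for _ in range(pop_size * 3)]
        cands.sort(key=lambda a: surrogate_predict(encode(a), archive),
                   reverse=True)
        # ...and truly evaluate only the surrogate's top picks.
        for a in cands[:pop_size // 2]:
            archive.append((encode(a), true_eval(a)))
        archive.sort(key=lambda rec: rec[1], reverse=True)
        pop = [{k: SEARCH_SPACE[k][c] for k, c in zip(KEYS, code)}
               for code, _ in archive[:pop_size]]
    return archive[0]  # (best encoding found, its true score)

best_code, best_score = surrogate_assisted_search()
```

The surrogate never replaces true evaluation; it only decides which candidates are worth evaluating, which is where the cost saving in surrogate-assisted evolutionary search comes from.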