Equivariant Graph Neural Networks (EGNNs) have demonstrated significant success in modeling microscale systems, including those in chemistry, biology, and materials science. However, EGNNs face substantial computational challenges due to the high cost of constructing edge features via spherical tensor products, which makes them impractical for large-scale systems. To address this limitation, we introduce E2Former, an equivariant and efficient transformer architecture that incorporates the Wigner $6j$ convolution (Wigner $6j$ Conv). By shifting the computational burden from edges to nodes, the Wigner $6j$ Conv reduces the complexity from $O(|\mathcal{E}|)$ to $O(|\mathcal{V}|)$, where $|\mathcal{E}|$ and $|\mathcal{V}|$ are the numbers of edges and nodes, while preserving both the model's expressive power and rotational equivariance. We show that this approach achieves a 7x to 30x speedup over conventional $\mathrm{SO}(3)$ convolutions. Furthermore, our empirical results demonstrate that E2Former mitigates the computational bottleneck of existing approaches without compromising the ability to capture detailed geometric information. This suggests a promising direction for scalable and efficient molecular modeling.
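To make the edge-to-node rearrangement concrete, below is a minimal PyTorch sketch of the simplest ($l = 1$) case, where bilinearity alone suffices: the per-edge sum $\sum_j \mathrm{tp}(x_j, r_i - r_j)$ is rewritten as $\mathrm{tp}(\sum_j x_j, r_i) - \sum_j \mathrm{tp}(x_j, r_j)$, so every tensor product is evaluated per node and edges only carry cheap additions. Here `tp` is a toy bilinear outer product standing in for a real spherical tensor product, and all function names are illustrative assumptions, not the paper's implementation; the full Wigner $6j$ Conv generalizes this rearrangement to higher-order harmonics via $6j$ recoupling, which this sketch does not implement.

```python
import torch

def tp(a, b):
    # Toy bilinear "tensor product" (flattened outer product).
    # Stands in for the expensive SO(3) spherical tensor product.
    return torch.einsum('ni,nj->nij', a, b).flatten(1)

def edge_wise(x, pos, edge_index):
    # Conventional SO(3) convolution, l = 1 case:
    # one tensor product per EDGE -> O(|E|) expensive ops.
    src, dst = edge_index
    msg = tp(x[src], pos[dst] - pos[src])
    out = torch.zeros(x.size(0), msg.size(1))
    return out.index_add_(0, dst, msg)

def node_wise(x, pos, edge_index):
    # Edge-to-node rearrangement, l = 1 case: bilinearity gives
    #   sum_j tp(x_j, r_i - r_j) = tp(sum_j x_j, r_i) - sum_j tp(x_j, r_j),
    # so every tensor product lands on a NODE -> O(|V|) expensive ops;
    # edges only carry cheap index_add aggregations.
    src, dst = edge_index
    s = torch.zeros_like(x).index_add_(0, dst, x[src])  # S_i = sum_j x_j
    m = tp(x, pos)                                      # m_j = tp(x_j, r_j), once per node
    agg_m = torch.zeros_like(m).index_add_(0, dst, m[src])
    return tp(s, pos) - agg_m

# Sanity check: both forms produce identical output on a toy graph.
x, pos = torch.randn(5, 4), torch.randn(5, 3)
edge_index = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 4]])
assert torch.allclose(edge_wise(x, pos, edge_index),
                      node_wise(x, pos, edge_index), atol=1e-5)
```

In this $l = 1$ case the two forms are exactly equal, while the count of expensive tensor products drops from $|\mathcal{E}|$ to $2|\mathcal{V}|$; the remaining per-edge work is plain feature aggregation, which is what makes the node-side formulation cheaper on dense graphs.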