Foundation models for materials modeling are advancing quickly, but training them remains expensive, often placing state-of-the-art methods out of reach for many research groups. We introduce Nequix, a compact E(3)-equivariant interatomic potential that pairs a simplified NequIP design with modern training practices, including equivariant root-mean-square layer normalization and the Muon optimizer, to retain accuracy while substantially reducing compute requirements. Nequix has 700K parameters and was trained in 100 A100 GPU-hours. On the Matbench-Discovery and MDR Phonon benchmarks, Nequix ranks third overall at roughly one-twentieth the training cost of most other methods, and its inference is two orders of magnitude faster than that of the current top-ranked model. We release the model weights and a fully reproducible codebase at https://github.com/atomicarchitects/nequix.
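The equivariant root-mean-square layer normalization mentioned above can be illustrated with a minimal NumPy sketch. This assumes features are grouped into a `(channels, irrep_dim)` block per irrep and normalized by the RMS of their per-channel norms; the function name and epsilon are illustrative assumptions, not Nequix's exact implementation.

```python
import numpy as np

def equivariant_rms_norm(features, eps=1e-6):
    """Normalize a (channels, irrep_dim) block of equivariant features.

    Dividing by the RMS of per-channel vector norms (with no mean
    subtraction) keeps the layer equivariant: a rotation preserves each
    channel's norm, so the normalization commutes with the rotation.
    """
    norms = np.linalg.norm(features, axis=-1)       # (channels,)
    rms = np.sqrt(np.mean(norms ** 2) + eps)
    return features / rms

# Example: two l=1 (vector) channels
x = np.array([[1.0, 0.0, 0.0],
              [0.0, 2.0, 0.0]])
y = equivariant_rms_norm(x)

# Equivariance check: rotating inputs then normalizing equals
# normalizing then rotating.
theta = 0.3
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
assert np.allclose(equivariant_rms_norm(x @ R.T), y @ R.T)
```

Because only norms enter the scale factor, the same idea extends to higher-order irreps; a learnable per-channel gain (as in standard RMSNorm) can be applied without breaking equivariance.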