Learning to solve vehicle routing problems (VRPs) has garnered much attention. However, most neural solvers are structured and trained independently for a specific problem, limiting their generality and practicality. In this paper, we aim to develop a unified neural solver that can cope with a range of VRP variants simultaneously. Specifically, we propose a multi-task vehicle routing solver with mixture-of-experts (MVMoE), which greatly enhances model capacity without a proportional increase in computation. We further develop a hierarchical gating mechanism for the MVMoE, delivering a good trade-off between empirical performance and computational complexity. Experimentally, our method significantly improves zero-shot generalization performance on 10 unseen VRP variants, and achieves decent results in the few-shot setting and on real-world benchmark instances. We further conduct extensive studies on the effect of MoE configurations in solving VRPs, and observe the superiority of hierarchical gating when facing out-of-distribution data. The source code is available at: https://github.com/RoyalSkye/Routing-MVMoE.
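The core idea behind using a mixture-of-experts, that total parameters grow with the number of experts while per-token compute stays roughly constant, can be sketched minimally as sparse top-k gating. This is an illustrative sketch only: the dimensions, expert count, top-k value, and random linear experts below are assumptions for demonstration, not the paper's actual MVMoE architecture or its hierarchical gating.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not the paper's configuration).
d, num_experts, top_k = 8, 4, 2  # hidden size, total experts, experts used per token

# Each "expert" is a simple linear map here; in practice experts are FFN sublayers.
experts = [rng.standard_normal((d, d)) for _ in range(num_experts)]
gate_w = rng.standard_normal((d, num_experts))

def moe_forward(x):
    """Route one token through only top_k of num_experts experts."""
    logits = x @ gate_w                   # gating score per expert
    chosen = np.argsort(logits)[-top_k:]  # indices of the top-k experts
    weights = np.exp(logits[chosen])
    weights /= weights.sum()              # softmax over the chosen experts only
    # Only top_k expert matmuls run, so compute is independent of num_experts,
    # while parameter count (model capacity) scales with num_experts.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, chosen))

y = moe_forward(rng.standard_normal(d))
```

Adding experts increases capacity without changing the cost of `moe_forward`, which is the trade-off the abstract refers to; the paper's hierarchical gating further refines how tokens are dispatched to experts.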