Learning to solve vehicle routing problems (VRPs) has garnered much attention. However, most neural solvers are structured and trained independently for a single problem, making them less generic and practical. In this paper, we aim to develop a unified neural solver that can cope with a range of VRP variants simultaneously. Specifically, we propose a multi-task vehicle routing solver with mixture-of-experts (MVMoE), which greatly enhances model capacity without a proportional increase in computation. We further develop a hierarchical gating mechanism for the MVMoE, delivering a good trade-off between empirical performance and computational complexity. Experimentally, our method significantly improves zero-shot generalization on 10 unseen VRP variants, and achieves decent results in the few-shot setting and on real-world benchmark instances. We further provide extensive studies on the effect of MoE configurations in solving VRPs, and observe, surprisingly, that the hierarchical gating attains much better out-of-distribution generalization. The source code is available at: https://github.com/RoyalSkye/Routing-MVMoE.
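To illustrate the mixture-of-experts idea referenced above, the sketch below shows a transformer-style feed-forward layer replaced by sparsely gated experts with top-k routing, the standard building block such a solver could use in place of a dense FFN. The expert count, layer sizes, and module names are illustrative assumptions, not the authors' exact MVMoE implementation, and the hierarchical gating variant is omitted here.

```python
# A minimal sketch (assumed configuration, not the authors' code) of a
# mixture-of-experts feed-forward layer with top-k token routing.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, d_model: int = 128, d_ff: int = 512,
                 num_experts: int = 4, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # Each expert is an independent feed-forward network.
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.ReLU(),
                          nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        ])
        # The gate scores each token and routes it to its top-k experts,
        # so only a fraction of parameters is active per token.
        self.gate = nn.Linear(d_model, num_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_nodes, d_model) -> flatten node embeddings for routing.
        b, n, d = x.shape
        tokens = x.reshape(b * n, d)
        scores = self.gate(tokens)                      # (tokens, num_experts)
        topk_vals, topk_idx = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(topk_vals, dim=-1)          # renormalize over chosen experts
        out = torch.zeros_like(tokens)
        for slot in range(self.top_k):
            idx = topk_idx[:, slot]
            w = weights[:, slot].unsqueeze(-1)
            for e, expert in enumerate(self.experts):
                mask = idx == e
                if mask.any():
                    out[mask] += w[mask] * expert(tokens[mask])
        return out.reshape(b, n, d)
```

With sparse routing, adding experts grows the parameter count (model capacity) while the per-token compute stays roughly that of k expert FFNs, which is the capacity-versus-computation trade-off the abstract alludes to.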