2024 | Jianan Zhou, Zhiguang Cao, Yaoxin Wu, Wen Song, Yining Ma, Jie Zhang, Chi Xu
This paper introduces MVMoE (Multi-Task Vehicle Routing Solver with Mixture-of-Experts), a unified neural solver designed to handle a wide range of Vehicle Routing Problem (VRP) variants simultaneously. The proposed method leverages mixture-of-experts (MoE) layers to enhance model capacity without a proportional increase in computational cost, and a hierarchical gating mechanism is developed to balance empirical performance against computational overhead. Experimental results demonstrate that MVMoE significantly improves zero-shot generalization on 10 unseen VRP variants and achieves decent performance in few-shot settings and on real-world benchmark instances. Extensive studies on MoE configurations show the superiority of hierarchical gating in handling out-of-distribution data. The source code is available at <https://github.com/RoyalSkye/Routing-MVMoE>.
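To make the MoE idea concrete, below is a minimal sketch of a generic top-k gated mixture-of-experts feed-forward layer of the kind that can replace the dense FFN sub-layer in a Transformer-based solver. This is an illustrative, assumption-based example rather than the MVMoE implementation: the class name, layer sizes, `num_experts`, and `top_k` are placeholders, and the paper's hierarchical gating is not reproduced here.

```python
# Sketch of a top-k gated MoE feed-forward layer (illustrative; not the
# authors' code). Hyperparameters below are assumed placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoEFeedForward(nn.Module):
    def __init__(self, d_model: int = 128, d_hidden: int = 512,
                 num_experts: int = 4, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # Each expert is an independent two-layer feed-forward network.
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.ReLU(),
                          nn.Linear(d_hidden, d_model))
            for _ in range(num_experts)
        ])
        # The gate scores each token and routes it to its top-k experts,
        # so only a subset of parameters is active per token.
        self.gate = nn.Linear(d_model, num_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) -> flatten tokens for routing
        b, n, d = x.shape
        tokens = x.reshape(-1, d)
        scores = self.gate(tokens)                        # (b*n, num_experts)
        topk_scores, topk_idx = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(topk_scores, dim=-1)          # weights over chosen experts
        out = torch.zeros_like(tokens)
        for e, expert in enumerate(self.experts):
            for slot in range(self.top_k):
                mask = topk_idx[:, slot] == e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(tokens[mask])
        return out.reshape(b, n, d)
```

In this sparse-activation setup, capacity grows with the number of experts while per-token compute stays roughly constant, which is the trade-off the abstract refers to when it says capacity is enhanced without a proportional increase in computation.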