2024 | Jianan Zhou, Zhiguang Cao, Yaoxin Wu, Wen Song, Yining Ma, Jie Zhang, Chi Xu
This paper introduces MVMoE, a multi-task vehicle routing solver built on mixture-of-experts (MoE) layers, which enlarge model capacity without a proportional increase in computational cost. The method addresses a limitation of existing neural solvers, which are typically tailored to a single VRP variant and therefore generalize poorly to others. MVMoE uses a hierarchical gating mechanism to balance empirical performance against computational efficiency, and the paper also studies how different MoE configurations affect VRP solving, finding that hierarchical gating improves out-of-distribution generalization. Experiments show strong zero-shot generalization to 10 unseen VRP variants, as well as good performance in few-shot settings and on real-world benchmarks, outperforming existing solvers in both zero-shot and few-shot scenarios. The source code is available at https://github.com/RoyalSkye/Routing-MVMoE.
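To make the MoE idea concrete, below is a minimal sketch of a sparse mixture-of-experts feed-forward layer with top-k gating, the kind of building block such a solver can swap in for a dense feed-forward layer. This is an illustrative assumption, not the authors' implementation or their hierarchical gating scheme; the expert count, hidden sizes, and top-k value are made-up values.

```python
# Minimal sparse MoE feed-forward layer with top-k gating (illustrative sketch,
# not the MVMoE authors' code). Hyperparameters are assumptions for demonstration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    def __init__(self, d_model=128, d_hidden=512, num_experts=4, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(d_model, num_experts)  # gating network scores each expert
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.ReLU(),
                          nn.Linear(d_hidden, d_model))
            for _ in range(num_experts)
        ])

    def forward(self, x):
        # x: (batch, nodes, d_model); each node embedding is routed to its top-k experts
        scores = self.gate(x)                                   # (B, N, num_experts)
        topk_vals, topk_idx = scores.topk(self.top_k, dim=-1)   # keep the k best experts per node
        weights = F.softmax(topk_vals, dim=-1)                  # normalize over the chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            idx = topk_idx[..., slot]                           # expert id assigned to this slot
            w = weights[..., slot].unsqueeze(-1)                # gating weight for this slot
            for e, expert in enumerate(self.experts):
                mask = (idx == e).unsqueeze(-1)                 # nodes routed to expert e
                if mask.any():
                    # For clarity the expert runs on all nodes and is masked afterwards;
                    # a real implementation would dispatch only the routed tokens.
                    out = out + mask * w * expert(x)
        return out

if __name__ == "__main__":
    layer = SparseMoELayer()
    x = torch.randn(8, 50, 128)   # e.g. 8 VRP instances with 50 node embeddings each
    print(layer(x).shape)         # torch.Size([8, 50, 128])
```

Because only k of the experts are active per node, capacity grows with the number of experts while the per-node compute stays roughly constant, which is the trade-off the summary above refers to.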