2024 | Bohao Wang, Jiawei Chen, Changdong Li, Sheng Zhou, Qihao Shi, Yang Gao, Yan Feng, Chun Chen, Can Wang
The paper introduces Distributionally Robust GNN (DR-GNN), a novel method that integrates Distributionally Robust Optimization (DRO) into Graph Neural Networks (GNNs) to address distribution shifts in Recommender Systems (RS). GNNs, which capture high-order collaborative signals, often rely on the IID assumption, which is violated in real-world scenarios due to dynamic user preferences and data biases. DR-GNN tackles this issue by reinterpreting GNNs as graph smoothing regularizers and incorporating DRO to enhance robustness against distribution shifts. The method introduces slight perturbations to the training distribution and uses a graph edge-adding strategy to expand the support of the distribution. Theoretical analyses and extensive experiments validate the effectiveness of DR-GNN against three types of distribution shifts: popularity, temporal, and exposure shifts. The code is available at https://github.com/WANGBohaoO-jpg/DR-GNN.
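To make the core idea concrete, below is a minimal sketch of how a DRO-style worst-case reweighting can be applied to a graph-smoothing loss over observed user-item edges. This is an illustrative approximation, not the authors' DR-GNN implementation; the KL-ball reweighting, the edge list, the embedding sizes, and the temperature `beta` are all assumptions introduced here for demonstration.

```python
# Sketch: KL-constrained DRO over per-edge graph-smoothing losses.
# Hypothetical shapes and hyperparameters; not the official DR-GNN code.
import torch


def graph_smoothing_edge_loss(user_emb, item_emb, edges):
    """Per-edge smoothing loss: squared distance between the normalized
    embeddings of each connected user-item pair."""
    u = torch.nn.functional.normalize(user_emb[edges[:, 0]], dim=-1)
    i = torch.nn.functional.normalize(item_emb[edges[:, 1]], dim=-1)
    return ((u - i) ** 2).sum(dim=-1)  # shape: [num_edges]


def kl_dro_loss(edge_losses, beta=0.5):
    """Worst-case reweighting of the empirical edge distribution within a
    KL-divergence ball: higher-loss edges receive exponentially larger weight
    (weights are detached so gradients flow only through the losses)."""
    weights = torch.softmax(edge_losses.detach() / beta, dim=0)
    return (weights * edge_losses).sum()


# Toy usage with random embeddings and a hypothetical edge list.
torch.manual_seed(0)
user_emb = torch.randn(100, 32, requires_grad=True)
item_emb = torch.randn(200, 32, requires_grad=True)
edges = torch.cat(
    [torch.randint(0, 100, (500, 1)), torch.randint(0, 200, (500, 1))], dim=1
)

losses = graph_smoothing_edge_loss(user_emb, item_emb, edges)
robust_loss = kl_dro_loss(losses)
robust_loss.backward()
print(f"DRO-weighted smoothing loss: {robust_loss.item():.4f}")
```

Under this reading, standard GNN propagation corresponds to minimizing the unweighted average of the edge losses, while the DRO variant upweights edges where smoothing performs worst, which is what makes the model less sensitive to shifts in the interaction distribution.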