Linear-Time Graph Neural Networks for Scalable Recommendations

May 13–17, 2024, Singapore, Singapore | Jiahao Zhang, Rui Xue, Wenqi Fan, Xin Xu, Qing Li, Jian Pei, Xiaorui Liu
The paper "Linear-Time Graph Neural Networks for Scalable Recommendations" addresses the scalability of Graph Neural Networks (GNNs) in recommender systems. Despite their strong expressive power in capturing high-order connectivity in user-item interaction data, the computational complexity of GNNs often limits their practical application, especially in large-scale industrial settings.
The authors propose Linear-Time Graph Neural Networks (LTGNN), a model that achieves linear-time complexity while retaining the expressive capabilities of GNNs. LTGNN combines implicit graph modeling with an efficient variance-reduced neighbor sampling strategy. The implicit modeling formulates propagation as the fixed-point equation of Personalized PageRank (PPR), capturing long-range dependencies with a single forward propagation layer and thereby reducing both the number of layers and the computational cost. The variance-reduced neighbor sampling reuses historical embeddings to reduce the variance introduced by sampling neighbors, further improving efficiency. Experiments on three real-world datasets (Yelp2018, Alibaba-iFashion, and Amazon-Large) show that LTGNN matches or exceeds the recommendation performance of state-of-the-art GNN models such as LightGCN while being significantly more efficient. By balancing prediction accuracy with computational cost, the proposed method shows strong potential for industrial recommendation applications.
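To make the implicit modeling idea concrete, the PPR fixed-point equation mentioned above has the form E* = (1 − α) Â E* + α E_in, where Â is the normalized user-item adjacency matrix and E_in are the input embeddings. The sketch below is a minimal dense NumPy illustration of solving this equation by fixed-point iteration on a toy bipartite graph; the variable names, the toy graph, and the iteration budget are illustrative assumptions, not the paper's actual (sampled, single-layer) implementation.

```python
import numpy as np

def ppr_fixed_point(A_hat, E_in, alpha=0.15, n_iters=200, tol=1e-6):
    """Iterate E <- (1 - alpha) * A_hat @ E + alpha * E_in.

    Converges because the update is a contraction: the spectral radius
    of (1 - alpha) * A_hat is at most (1 - alpha) < 1 for a
    symmetrically normalized adjacency matrix.
    """
    E = E_in.copy()
    for _ in range(n_iters):
        E_next = (1 - alpha) * A_hat @ E + alpha * E_in
        if np.linalg.norm(E_next - E) < tol:
            return E_next
        E = E_next
    return E

# Toy bipartite interaction graph: 2 users, 2 items (hypothetical example).
A = np.array([[0, 0, 1, 1],
              [0, 0, 1, 0],
              [1, 1, 0, 0],
              [1, 0, 0, 0]], dtype=float)
deg = A.sum(axis=1)
A_hat = A / np.sqrt(deg[:, None] * deg[None, :])  # D^{-1/2} A D^{-1/2}

E0 = np.random.default_rng(0).normal(size=(4, 8))  # input embeddings
E_star = ppr_fixed_point(A_hat, E0)

# The result approximately satisfies the PPR fixed-point equation:
residual = np.linalg.norm(E_star - ((1 - 0.15) * A_hat @ E_star + 0.15 * E0))
```

LTGNN's contribution, per the summary, is avoiding this many-iteration (or many-layer) unrolling at training time: it keeps a single propagation layer and reuses historical embeddings across training steps so that each update costs time linear in the number of interactions.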