Knowledge Graph Convolutional Networks for Recommender Systems

May 13–17, 2019, San Francisco, CA, USA | Hongwei Wang, Miao Zhao, Xing Xie, Wenjie Li, Minyi Guo
This paper proposes Knowledge Graph Convolutional Networks (KGCN) for recommender systems to address the sparsity and cold-start problems of collaborative filtering. KGCN is an end-to-end framework that captures inter-item relatedness by mining the items' associated attributes in a knowledge graph (KG). For each entity it samples a fixed-size set of neighbors as the entity's receptive field, then aggregates neighborhood information with a bias given by user-relation scores to compute the entity's representation. The receptive field can be extended to multiple hops to model high-order proximity and capture users' potential long-distance interests. KGCN is implemented in a minibatch fashion, so it scales to large datasets and KGs.

The model is evaluated on three datasets: MovieLens-20M (movies), Book-Crossing (books), and Last.FM (music). Experimental results show that KGCN outperforms strong baselines, achieving average AUC gains of 4.4%, 8.1%, and 6.2% in movie, book, and music recommendation, respectively.

KGCN extends graph convolutional networks (GCN) to KGs by aggregating and incorporating neighborhood information with bias, which lets it capture both the high-order structure and the semantic information of the KG. Three aggregators are studied: the sum aggregator adds the entity and neighborhood representations before a nonlinear transformation, the concat aggregator concatenates them, and the neighbor aggregator uses only the neighborhood representation.

Trained in minibatches, KGCN outperforms the baselines in both CTR prediction and top-K recommendation. The paper also discusses related work, including GCN, PinSage, and GAT, and highlights the value of KGs' semantic and structural information for recommendation. KGCN performs especially well in sparse scenarios such as Book-Crossing and Last.FM, whose user-item interactions are sparser than those of MovieLens-20M.
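The neighbor sampling, relation-biased neighborhood aggregation, and the three aggregators described above can be sketched as follows. This is a minimal NumPy illustration under stated assumptions (an inner-product user-relation score, a softmax normalization, and a ReLU nonlinearity), not the authors' implementation; all names and shapes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_neighbors(adj, entity, k):
    """Sample a fixed-size receptive field of k (relation, neighbor) edges.

    adj maps an entity id to its list of (relation_id, neighbor_id) edges;
    sampling with replacement keeps the per-entity computation uniform.
    """
    edges = adj[entity]
    idx = rng.choice(len(edges), size=k, replace=True)
    return [edges[i] for i in idx]

def biased_neighborhood(user_emb, rel_embs, nbr_embs):
    """Relation-biased neighborhood representation.

    Scores each sampled neighbor by the user-relation inner product,
    normalizes the scores with a softmax, and returns the weighted
    sum of the neighbor embeddings.
    """
    scores = rel_embs @ user_emb              # user-relation scores
    w = np.exp(scores - scores.max())
    w /= w.sum()                              # softmax over neighbors
    return w @ nbr_embs                       # biased neighborhood vector

def aggregate(v, v_nbr, W, b, kind="sum"):
    """The three aggregator variants (ReLU assumed as the nonlinearity).

    For 'concat', W must have twice as many columns as rows.
    """
    if kind == "sum":                         # transform v + v_nbr
        z = W @ (v + v_nbr) + b
    elif kind == "concat":                    # transform [v; v_nbr]
        z = W @ np.concatenate([v, v_nbr]) + b
    elif kind == "neighbor":                  # transform v_nbr alone
        z = W @ v_nbr + b
    else:
        raise ValueError(f"unknown aggregator: {kind}")
    return np.maximum(z, 0.0)
```

Stacking this aggregation over multiple hops extends the receptive field, which is how the model reaches high-order proximity in the KG.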
The paper concludes that KGCN is effective for recommendation and suggests future work on non-uniform neighbor sampling, user-end KGs, and combining KGs on both the user and item ends.