16 Aug 2017 | Xiangnan He, Hanwang Zhang, Min-Yen Kan, Tat-Seng Chua
This paper improves both the effectiveness and efficiency of Matrix Factorization (MF) for implicit feedback. The authors address two critical issues in existing work: (1) the uniform weighting of missing data, which is unrealistic in real-world settings, and (2) the offline nature of most methods, which fails to handle the dynamic nature of online data. To address these issues, they propose an item popularity-aware weighting scheme for missing data, which is more effective and flexible than the uniform-weight assumption, and design a new learning algorithm based on element-wise Alternating Least Squares (eALS) that efficiently optimizes the MF model under variably-weighted missing data. By avoiding the expensive matrix inversion of conventional ALS, eALS achieves significant speedups, and its efficiency enables an incremental update strategy that instantly refreshes the model as new feedback arrives, allowing it to adapt to new interactions in real time.

The key contributions are: (1) an item popularity-aware weighting scheme for missing data; (2) a fast eALS learning algorithm and an incremental update strategy for real-time online learning; and (3) extensive experiments on two public datasets, in both offline and online protocols, showing that eALS consistently outperforms state-of-the-art implicit MF methods. The paper also surveys related work on handling missing data, optimization techniques, and incremental learning strategies for online learning. The implementation is available at https://github.com/hexiangnan/sigir16-eals.
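To make the two ideas above concrete, the sketch below illustrates (a) a popularity-aware weight for each item's missing entries, proportional to popularity raised to a power, and (b) one element-wise ALS sweep that updates each latent factor coordinate in closed form, with no matrix inversion. This is a dense, unoptimized illustration under assumed hyperparameter names (w0, alpha, lam); the paper's actual algorithm adds a caching trick so that missing entries never need to be enumerated.

```python
import numpy as np

def popularity_aware_weights(item_counts, w0=1.0, alpha=0.5):
    # Weight for the missing entries of each item, proportional to
    # popularity^alpha and normalized to sum to w0 (hypothetical
    # hyperparameter names; not taken verbatim from the paper).
    f = np.asarray(item_counts, dtype=float) ** alpha
    return w0 * f / f.sum()

def eals_epoch(R, W, P, Q, lam=0.01):
    """One element-wise ALS sweep over user factors P and item factors Q.

    R: (n_users, n_items) implicit-feedback matrix (0 for missing),
    W: (n_users, n_items) per-entry weights (observed entries weighted
       highly, missing entries given the popularity-aware weight).
    Each scalar P[u, f] / Q[i, f] has a closed-form update, so no
    K x K matrix inversion is needed. Dense sketch: O(MNK) per sweep.
    """
    n_users, n_items = R.shape
    k = P.shape[1]
    R_hat = P @ Q.T  # cached predictions, kept in sync incrementally
    for u in range(n_users):
        for f in range(k):
            # prediction with factor f's contribution removed
            r_f = R_hat[u] - P[u, f] * Q[:, f]
            num = np.dot(W[u] * (R[u] - r_f), Q[:, f])
            den = np.dot(W[u], Q[:, f] ** 2) + lam
            new = num / den
            R_hat[u] += (new - P[u, f]) * Q[:, f]
            P[u, f] = new
    for i in range(n_items):
        for f in range(k):
            r_f = R_hat[:, i] - Q[i, f] * P[:, f]
            num = np.dot(W[:, i] * (R[:, i] - r_f), P[:, f])
            den = np.dot(W[:, i], P[:, f] ** 2) + lam
            new = num / den
            R_hat[:, i] += (new - Q[i, f]) * P[:, f]
            Q[i, f] = new
    return P, Q
```

Because each coordinate update exactly minimizes the regularized weighted squared error in that scalar, every sweep monotonically decreases the objective, which is what makes the incremental (one-interaction-at-a-time) refresh strategy cheap to apply online.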
The experiments show that eALS outperforms other methods in both offline and online settings, demonstrating its effectiveness and efficiency for implicit feedback learning.