Temporal Graph Contrastive Learning for Sequential Recommendation

2024 | Shengzhe Zhang, Liyi Chen, Chao Wang, Shuangli Li, Hui Xiong
The paper introduces TGCL4SR, a Temporal Graph Contrastive Learning method for Sequential Recommendation that leverages both local interaction sequences and global temporal graphs to model item correlations and analyze user behaviors. The key contributions are:

1. **Temporal Item Transition Graph (TTTG)**: This graph integrates global interactions to capture item correlations, with edges between items from adjacent interactions in a sequence, and attributes including timestamps and users.
2. **Dual Transformations**: To handle data sparsity and noise, the TTTG is augmented using neighbor sampling and time disturbance, reducing graph scale and improving robustness.
3. **Temporal Item Transition Graph Convolutional Network (TITConv)**: This network captures item transition patterns in the augmented TTTG by aggregating features of neighboring items together with absolute time and user attributes.
4. **Temporal Graph Contrastive Learning (TGCL)**: This mechanism enhances representation uniformity by contrasting augmented graphs derived from the same sequence, using subgraph and disturbed contrastive losses.
5. **Temporal Sequence Encoder**: This encoder incorporates time interval embeddings into the Transformer architecture to capture evolving user interests.
6. **Training Loss**: The model is trained with a combination of cross-entropy, the contrastive losses, and Maximum Mean Discrepancy (MMD) to align item representations from the global graph and the local sequences.

Experiments on four real-world datasets show that TGCL4SR outperforms state-of-the-art baselines, demonstrating its effectiveness in sequential recommendation tasks.
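To make the training objective in contributions 4 and 6 concrete, here is a minimal NumPy sketch of two of its ingredients: an InfoNCE-style contrastive loss between embeddings from two augmented views of the same sequence, and an RBF-kernel MMD term that aligns graph-side and sequence-side item representations. Function names (`info_nce`, `mmd_rbf`) and hyperparameters (`tau`, `gamma`) are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def info_nce(z1, z2, tau=0.2):
    """Contrastive loss: embeddings of the two augmented views of the
    same sequence (row i of z1 and z2) are treated as positives.
    Note: tau and this exact form are assumptions for illustration."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / tau  # pairwise cosine similarities / temperature
    # log-softmax over each row; positives sit on the diagonal
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))

def mmd_rbf(x, y, gamma=1.0):
    """Biased MMD^2 estimate with an RBF kernel, used here to align
    graph-side and sequence-side item representations (gamma is assumed)."""
    def k(a, b):
        d = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d)
    return k(x, x).mean() + k(y, y).mean() - 2.0 * k(x, y).mean()

# Toy embeddings standing in for the model's outputs.
rng = np.random.default_rng(0)
graph_emb = rng.normal(size=(8, 16))  # from the graph encoder (TITConv)
seq_emb = rng.normal(size=(8, 16))    # from the temporal sequence encoder
view_b = graph_emb + 0.1 * rng.normal(size=graph_emb.shape)  # 2nd augmentation

# Joint auxiliary loss: contrastive term + MMD alignment term.
loss = info_nce(graph_emb, view_b) + mmd_rbf(graph_emb, seq_emb)
print(float(loss))
```

In the full model this would be added to the cross-entropy recommendation loss with weighting coefficients; the sketch only shows that both auxiliary terms reduce to simple pairwise computations over batch embeddings.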