On the Feasibility of Simple Transformer for Dynamic Graph Modeling

Yuxia Wu, Yuan Fang*, Lizi Liao | WWW '24, May 13–17, 2024, Singapore, Singapore
Dynamic graph modeling is crucial for understanding complex structures in web graphs, spanning applications in social networks, recommender systems, and more. Existing methods often overlook fine-grained temporal details or struggle with long-term dependencies. This work leverages the Transformer's self-attention mechanism to address these limitations. The proposed SimpleDyG model re-conceptualizes dynamic graph modeling as a sequence modeling problem and introduces a novel temporal alignment technique that captures temporal evolution patterns while streamlining the modeling process. Extensive experiments on four real-world datasets demonstrate the competitive performance of SimpleDyG compared to state-of-the-art approaches, despite its simple design.

Keywords: dynamic graph modeling, Transformer, graph representation learning

ACM Reference Format: Yuxia Wu, Yuan Fang, Lizi Liao. On the Feasibility of Simple Transformer for Dynamic Graph Modeling. In *Proceedings of the ACM Web Conference 2024 (WWW '24), May 13–17, 2024, Singapore, Singapore*. ACM, New York, NY, USA, 11 pages. https://doi.org/10.1145/3589334.3645622
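The abstract's central idea, recasting a dynamic graph as a sequence modeling problem with temporal alignment, can be illustrated with a minimal sketch. The token names (`<hist>`, `<time_k>`, `<endofhist>`) and the helper `to_sequences` are illustrative assumptions, not the paper's exact scheme: each node's timestamped interactions are flattened into a token sequence, with shared time-step tokens inserted so that sequences from different nodes are aligned on the same discretized timeline before being fed to a standard decoder-only Transformer.

```python
from collections import defaultdict

# Illustrative sketch (not the paper's exact tokenization): map each node's
# timestamped interactions to a token sequence with shared time-step tokens.

def to_sequences(edges):
    """edges: list of (u, v, t) triples with integer time steps t >= 0."""
    per_node = defaultdict(list)
    for u, v, t in sorted(edges, key=lambda e: e[2]):
        per_node[u].append((v, t))
    sequences = {}
    for u, events in per_node.items():
        tokens = ["<hist>"]          # start-of-history marker
        current = -1
        for v, t in events:
            # Emit time-step tokens up to this event's step; the same
            # <time_k> vocabulary is shared across all nodes, which is
            # what aligns their sequences temporally.
            while current < t:
                current += 1
                tokens.append(f"<time_{current}>")
            tokens.append(f"node_{v}")
        tokens.append("<endofhist>") # end-of-history marker
        sequences[u] = tokens
    return sequences

edges = [(0, 1, 0), (0, 2, 0), (0, 3, 1), (4, 0, 1)]
seqs = to_sequences(edges)
print(seqs[0])
# → ['<hist>', '<time_0>', 'node_1', 'node_2', '<time_1>', 'node_3', '<endofhist>']
```

Such sequences can then be consumed by any autoregressive Transformer, with next-token prediction serving as the link-forecasting objective.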