GMAN: A Graph Multi-Attention Network for Traffic Prediction


26 Nov 2019 | Chuanpan Zheng, Xiaoliang Fan, Cheng Wang, Jianzhong Qi
The paper introduces the Graph Multi-Attention Network (GMAN), a novel approach for long-term traffic prediction on road network graphs. GMAN addresses the challenges of complex spatio-temporal correlations and error propagation by combining spatial and temporal attention mechanisms, gated fusion, and a transform attention layer. Its encoder-decoder architecture encodes historical traffic features and predicts future traffic conditions, with the transform attention layer converting the encoded historical features into future representations so the decoder does not have to roll predictions forward step by step. Experimental results on two real-world datasets (Xiamen and PeMS) demonstrate that GMAN outperforms state-of-the-art methods, achieving up to a 4% improvement in Mean Absolute Error (MAE) for 1-hour-ahead predictions. The model also shows superior fault tolerance and computational efficiency.
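To make the encoder / transform-attention / decoder pipeline described above more concrete, here is a minimal PyTorch sketch. It is an illustrative assumption, not the authors' implementation: the class names, hidden sizes, and the use of `nn.MultiheadAttention` are stand-ins, and the paper's spatio-temporal embeddings and graph-aware spatial attention are simplified away.

```python
# Minimal sketch of a GMAN-style model: stacked spatio-temporal attention
# blocks with gated fusion in an encoder-decoder, bridged by a transform
# attention layer. All design details here are illustrative assumptions.
import torch
import torch.nn as nn


class STAttBlock(nn.Module):
    """Spatial attention over nodes and temporal attention over time steps,
    merged by gated fusion with a residual connection."""
    def __init__(self, d_model: int, num_heads: int = 4):
        super().__init__()
        self.spatial_att = nn.MultiheadAttention(d_model, num_heads, batch_first=True)
        self.temporal_att = nn.MultiheadAttention(d_model, num_heads, batch_first=True)
        self.gate = nn.Linear(2 * d_model, d_model)

    def forward(self, x):  # x: (batch, time, nodes, d_model)
        b, t, n, d = x.shape
        # Spatial branch: attend across the N nodes at each time step.
        xs = x.reshape(b * t, n, d)
        hs, _ = self.spatial_att(xs, xs, xs)
        hs = hs.reshape(b, t, n, d)
        # Temporal branch: attend across the T time steps at each node.
        xt = x.permute(0, 2, 1, 3).reshape(b * n, t, d)
        ht, _ = self.temporal_att(xt, xt, xt)
        ht = ht.reshape(b, n, t, d).permute(0, 2, 1, 3)
        # Gated fusion of the two branches.
        z = torch.sigmoid(self.gate(torch.cat([hs, ht], dim=-1)))
        return x + z * hs + (1 - z) * ht


class TransformAttention(nn.Module):
    """Converts the P encoded historical steps into Q future representations,
    which is what lets the decoder avoid step-by-step error propagation."""
    def __init__(self, d_model: int, num_heads: int = 4):
        super().__init__()
        self.att = nn.MultiheadAttention(d_model, num_heads, batch_first=True)

    def forward(self, h_enc, ste_future):  # (b, P, n, d), (b, Q, n, d)
        b, p, n, d = h_enc.shape
        q = ste_future.shape[1]
        keys = h_enc.permute(0, 2, 1, 3).reshape(b * n, p, d)
        queries = ste_future.permute(0, 2, 1, 3).reshape(b * n, q, d)
        out, _ = self.att(queries, keys, keys)  # queries come from future embeddings
        return out.reshape(b, n, q, d).permute(0, 2, 1, 3)


class GMANSketch(nn.Module):
    def __init__(self, d_model: int = 64, num_blocks: int = 2):
        super().__init__()
        self.input_proj = nn.Linear(1, d_model)
        self.encoder = nn.ModuleList([STAttBlock(d_model) for _ in range(num_blocks)])
        self.transform = TransformAttention(d_model)
        self.decoder = nn.ModuleList([STAttBlock(d_model) for _ in range(num_blocks)])
        self.output_proj = nn.Linear(d_model, 1)

    def forward(self, x_hist, ste_future):
        # x_hist: (batch, P, nodes, 1) historical traffic readings.
        # ste_future: (batch, Q, nodes, d_model) embeddings of the future steps.
        h = self.input_proj(x_hist)
        for blk in self.encoder:
            h = blk(h)
        h = self.transform(h, ste_future)
        for blk in self.decoder:
            h = blk(h)
        return self.output_proj(h)  # (batch, Q, nodes, 1) predictions


# Example shapes: 12 historical steps, 12 future steps, 50 sensors.
model = GMANSketch()
pred = model(torch.randn(2, 12, 50, 1), torch.randn(2, 12, 50, 64))
print(pred.shape)  # torch.Size([2, 12, 50, 1])
```

Because the transform attention layer produces all Q future representations in one pass, predicting the 12th step ahead costs no more than predicting the 1st, which is the mechanism behind the long-term accuracy and error-propagation claims above.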