AUGUST 2019 | Zonghan Wu, Shirui Pan, Member, IEEE, Fengwen Chen, Guodong Long, Chengqi Zhang, Senior Member, IEEE, Philip S. Yu, Fellow, IEEE
A comprehensive survey on graph neural networks (GNNs) discusses the growing importance of GNNs in handling non-Euclidean data represented as graphs. The paper categorizes GNNs into four types: recurrent GNNs, convolutional GNNs, graph autoencoders, and spatial-temporal GNNs. It reviews applications of GNNs across different domains and summarizes open-source code, benchmark datasets, and model evaluations. The paper also outlines potential research directions in this rapidly evolving field.
The paper begins by highlighting the success of deep learning in tasks such as image classification and speech recognition, which rely on Euclidean data. However, many real-world applications involve non-Euclidean data represented as graphs, where nodes and edges form complex relationships. Traditional machine learning algorithms struggle with such data due to the irregular structure of graphs. To address this, researchers have developed GNNs that can process graph-structured data.
The paper discusses the differences between GNNs and network embedding, as well as between GNNs and graph kernel methods. It defines key concepts such as graphs, directed graphs, and spatial-temporal graphs. The paper then presents a taxonomy of GNNs, including recurrent GNNs, convolutional GNNs, graph autoencoders, and spatial-temporal GNNs. It provides an overview of each category, including their architectures, applications, and challenges.
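To make the key definitions concrete, here is a minimal sketch (not from the paper) of how a graph is typically represented for a GNN: an adjacency matrix encoding edges and a feature matrix with one row per node. The specific arrays are illustrative assumptions.

```python
import numpy as np

# A toy undirected graph with 4 nodes and edges (0-1, 1-2, 2-3).
# For an undirected graph the adjacency matrix A is symmetric;
# a directed graph simply drops that symmetry requirement.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# Node feature matrix X: one row of attributes per node
# (one-hot features here, purely for illustration).
X = np.eye(4)

degrees = A.sum(axis=1)  # per-node degree: [1, 2, 2, 1]
```

A spatial-temporal graph extends this picture by attaching a feature matrix `X_t` to each time step, so the node attributes evolve while the topology (or a sequence of topologies) carries the spatial structure.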
The paper also discusses training frameworks for GNNs, including semi-supervised, supervised, and unsupervised learning settings. It summarizes the main characteristics of representative GNNs, comparing their input sources, pooling layers, readout layers, and time complexity. The paper highlights the importance of GNNs in tasks such as node classification, link prediction, and graph classification.
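As a concrete instance of the convolutional GNN category used for node classification, the sketch below implements one propagation step in the style of the GCN rule H' = ReLU(D^{-1/2}(A+I)D^{-1/2} H W). This is a hedged illustration, not the survey's own code; the toy two-node graph and identity weight matrix are assumptions made for readability.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution step: H' = ReLU(D^-1/2 (A+I) D^-1/2 H W)."""
    A_hat = A + np.eye(A.shape[0])          # add self-loops so a node keeps its own features
    d = A_hat.sum(axis=1)                   # degrees of the self-loop-augmented graph
    D_inv_sqrt = np.diag(d ** -0.5)         # D^{-1/2}
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt  # symmetrically normalized adjacency
    return np.maximum(A_norm @ H @ W, 0.0)  # aggregate neighbors, transform, ReLU

A = np.array([[0., 1.],
              [1., 0.]])                    # two nodes joined by one edge
H = np.eye(2)                               # one-hot input features
W = np.eye(2)                               # identity weights, for illustration only
H_next = gcn_layer(A, H, W)                 # each node now averages itself and its neighbor
```

Stacking several such layers and applying a softmax classifier to the final node embeddings yields the semi-supervised node-classification setup the survey describes, where the loss is computed only on the labeled nodes.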
The paper concludes by discussing the future directions of GNN research, including model depth, scalability trade-offs, heterogeneity, and dynamicity. It emphasizes the need for further theoretical and practical advancements in GNNs to address the challenges posed by graph-structured data.