Graph neural networks: A review of methods and applications

2020 | Jie Zhou, Ganqu Cui, Shengding Hu, Zhengyan Zhang, Cheng Yang, Zhiyuan Liu, Lifeng Wang, Changcheng Li, Maosong Sun
This paper provides a comprehensive review of graph neural networks (GNNs), covering their design pipeline, computational modules, and applications. GNNs are neural models that capture dependencies in graph data through message passing between nodes. The authors trace the evolution of GNNs from early recursive and recurrent neural networks to modern variants such as graph convolutional networks (GCNs), graph attention networks (GATs), and graph recurrent networks (GRNs).

They present a general design pipeline for GNN models: finding the graph structure, specifying the graph type and scale, designing loss functions, and building the model from computational modules. The paper also categorizes GNN applications into structural and non-structural scenarios and proposes four open problems for future research.

The review covers the main computational modules, including convolution operators, sampling modules, and pooling modules, and discusses their instantiations and variants. It further explores attention-based spatial approaches and general frameworks for spatial approaches, such as the mixture model network (MoNet), message passing neural network (MPNN), non-local neural network (NLNN), and graph network (GN). Finally, the paper highlights the limitations of the vanilla GNN and surveys methods that address them, including convergence-based and gate-based approaches, as well as skip connections that improve the performance of deeper GNN models.
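To make the message-passing idea concrete, here is a minimal NumPy sketch of the widely used GCN propagation rule H' = σ(D̂^(-1/2)(A + I)D̂^(-1/2) H W) from Kipf and Welling (2017), one of the convolution operators the review covers. The function and variable names are illustrative assumptions, not code from the paper.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution layer: each node aggregates degree-normalized
    neighbor features, then applies a shared linear map and ReLU."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    deg = A_hat.sum(axis=1)                   # node degrees (>= 1 after self-loops)
    D_inv_sqrt = np.diag(deg ** -0.5)         # D_hat^{-1/2}
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt  # symmetric normalization
    return np.maximum(A_norm @ H @ W, 0.0)    # aggregate, transform, ReLU

# Tiny example: a 3-node path graph, 2-d input features, 4-d output features.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
H = np.random.randn(3, 2)
W = np.random.randn(2, 4)
print(gcn_layer(A, H, W).shape)  # (3, 4)
```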
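The MPNN framework mentioned above generalizes such layers into a message function M and an update function U: each node sums messages from its neighbors and then updates its state. The sketch below, assuming a toy dict-of-arrays graph representation and hypothetical message_fn/update_fn callables, shows one round of this template (after Gilmer et al., 2017).

```python
import numpy as np

def mpnn_step(h, edges, message_fn, update_fn):
    """One round of generic message passing: m_v = sum_w M(h_v, h_w),
    followed by h_v' = U(h_v, m_v)."""
    m = {v: np.zeros_like(h[v]) for v in h}
    for src, dst in edges:                    # directed edges src -> dst
        m[dst] += message_fn(h[dst], h[src])  # accumulate incoming messages
    return {v: update_fn(h[v], m[v]) for v in h}

# Toy instantiation: messages are the neighbor states themselves,
# and the update averages the old state with the message sum.
h = {0: np.array([1.0, 0.0]), 1: np.array([0.0, 1.0]), 2: np.array([1.0, 1.0])}
edges = [(0, 1), (1, 0), (1, 2), (2, 1)]
h = mpnn_step(h, edges,
              message_fn=lambda hv, hw: hw,
              update_fn=lambda hv, m: 0.5 * (hv + m))
print(h[1])  # [1. 1.]
```

Choosing different M and U recovers specific variants: degree-normalized sums give GCN-style layers, while attention-weighted messages give GAT-style layers.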