SEMI-SUPERVISED CLASSIFICATION WITH GRAPH CONVOLUTIONAL NETWORKS

22 Feb 2017 | Thomas N. Kipf, Max Welling
This paper presents a scalable approach to semi-supervised learning on graph-structured data using graph convolutional networks (GCNs). The model operates directly on graphs and motivates its architecture via a localized first-order approximation of spectral graph convolutions. It scales linearly in the number of graph edges and learns hidden-layer representations that encode both local graph structure and node features. Tested on citation networks and a knowledge graph dataset, the model significantly outperforms related methods.

The core contribution is a simple, well-behaved layer-wise propagation rule for neural networks that operate on graphs, derived from a first-order approximation of localized spectral filters. The authors provide theoretical motivation for the resulting multi-layer GCN and apply it to fast, scalable semi-supervised classification of nodes in a graph.
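For concreteness, the renormalized layer-wise propagation rule described above takes the following form in the original paper (notation follows the paper):

    H^{(l+1)} = \sigma\left( \tilde{D}^{-1/2} \, \tilde{A} \, \tilde{D}^{-1/2} \, H^{(l)} W^{(l)} \right),

where \tilde{A} = A + I_N is the adjacency matrix with added self-loops, \tilde{D}_{ii} = \sum_j \tilde{A}_{ij} is its degree matrix, H^{(l)} is the matrix of node activations in layer l (with H^{(0)} = X, the node feature matrix), W^{(l)} is a layer-specific trainable weight matrix, and \sigma is a nonlinearity such as ReLU.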
The paper also reviews related work on graph-based semi-supervised learning and on neural networks for graphs, and compares the proposed model against several baselines, including label propagation, semi-supervised embedding, manifold regularization, and skip-gram based graph embeddings. Experiments on citation networks and a knowledge graph dataset show that the GCN compares favorably with these methods in both classification accuracy and efficiency (measured in wall-clock time); an additional evaluation on random graphs confirms that training time scales well with graph size.

The paper concludes that the proposed GCN model, built on this efficient propagation rule, encodes both graph structure and node features in a way that is useful for semi-supervised classification. In this setting, it outperforms several recently proposed methods by a significant margin while remaining computationally efficient.
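To make the architecture concrete, below is a minimal NumPy sketch of a two-layer GCN forward pass for node classification, softmax(A_hat · ReLU(A_hat X W0) · W1), with A_hat the renormalized adjacency defined above. This is an illustrative re-implementation under simplifying assumptions (dense matrices, no dropout or training loop), not the authors' released code; the function names and the toy graph are chosen here purely for clarity.

    # Minimal two-layer GCN forward pass (illustrative sketch).
    import numpy as np

    def normalize_adjacency(A):
        """Return D~^{-1/2} (A + I) D~^{-1/2}, the renormalized adjacency."""
        A_tilde = A + np.eye(A.shape[0])          # add self-loops
        d = A_tilde.sum(axis=1)                   # degree of each node
        D_inv_sqrt = np.diag(1.0 / np.sqrt(d))    # D~^{-1/2}
        return D_inv_sqrt @ A_tilde @ D_inv_sqrt

    def gcn_forward(A, X, W0, W1):
        """Two-layer GCN: softmax(A_hat @ ReLU(A_hat @ X @ W0) @ W1)."""
        A_hat = normalize_adjacency(A)
        H = np.maximum(A_hat @ X @ W0, 0.0)       # hidden layer with ReLU
        Z = A_hat @ H @ W1                        # per-node class scores
        Z -= Z.max(axis=1, keepdims=True)         # numerically stable softmax
        expZ = np.exp(Z)
        return expZ / expZ.sum(axis=1, keepdims=True)

    # Toy usage: a path graph with 4 nodes, 3 features, 2 classes.
    rng = np.random.default_rng(0)
    A = np.array([[0, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    X = rng.normal(size=(4, 3))
    W0 = rng.normal(size=(3, 8)) * 0.1
    W1 = rng.normal(size=(8, 2)) * 0.1
    probs = gcn_forward(A, X, W0, W1)             # shape (4, 2), rows sum to 1

In the semi-supervised setting, the weights W0 and W1 would be trained with a cross-entropy loss evaluated only on the labeled nodes, while the propagation through A_hat spreads information from labeled to unlabeled nodes.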