20 Jun 2019 | Felix Wu * 1 Tianyi Zhang * 1 Amauri Holanda de Souza Jr. * 1 2 Christopher Fifty 1 Tao Yu 1 Kilian Q. Weinberger 1
This paper aims to simplify Graph Convolutional Networks (GCNs) by removing nonlinearities and collapsing weight matrices between layers, resulting in a linear model. The simplified model, referred to as Simple Graph Convolution (SGC), is theoretically analyzed and shown to correspond to a fixed low-pass filter followed by a linear classifier. Experimental results demonstrate that SGC maintains or improves accuracy compared to GCNs on various tasks while being computationally more efficient and interpretable. SGC scales well to large datasets and outperforms FastGCN by up to two orders of magnitude in speed on the Reddit dataset. The paper also provides a spectral analysis of SGC, showing that it acts as a low-pass filter, which explains the effectiveness of the "renormalization trick" in GCNs.
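The pipeline the abstract describes — repeated application of a fixed, normalized adjacency operator to the node features, followed by an ordinary linear classifier — can be sketched in a few lines of NumPy. This is an illustrative sketch, not the authors' reference implementation; the function name `sgc_features` and the toy path graph are invented for the example, and the normalization shown is the standard "renormalized" adjacency S = D^{-1/2}(A + I)D^{-1/2} used by GCNs.

```python
import numpy as np

def sgc_features(adj, X, k=2):
    """Precompute S^k X for Simple Graph Convolution (illustrative sketch).

    S is the renormalized adjacency D^{-1/2}(A + I)D^{-1/2}; SGC applies
    it k times to the feature matrix X and then trains a plain linear
    (e.g. logistic-regression) classifier on the result.
    """
    A = adj + np.eye(adj.shape[0])          # "renormalization trick": add self-loops
    d = A.sum(axis=1)                       # degrees of the self-loop graph
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    S = D_inv_sqrt @ A @ D_inv_sqrt         # symmetrically normalized adjacency
    H = X.copy()
    for _ in range(k):                      # k fixed propagation steps, no nonlinearity
        H = S @ H
    return H                                # feed H to any linear classifier

# toy example: a 4-node path graph with 2-dimensional features
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
X = np.random.randn(4, 2)
H = sgc_features(adj, X, k=2)
```

Because S is fixed, S^k X can be precomputed once before training, which is the source of the speedups the abstract reports.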