21 May 2014 | Joan Bruna, Wojciech Zaremba, Arthur Szlam, Yann LeCun
This paper proposes two constructions for deep neural networks on graphs: a spatial construction and a spectral construction. The spatial construction generalizes convolutional networks to graphs by using hierarchical clustering and locally connected layers, which reduce the number of parameters to O(n), where n is the input size. The spectral construction uses the graph Laplacian to define convolutions in the Fourier domain, allowing efficient forward propagation with a parameter count independent of the input size.

Both constructions enable efficient deep architectures on graphs, particularly for low-dimensional graphs. The spatial construction relies on multiscale clustering and local receptive fields, while the spectral construction uses the eigenvectors of the graph Laplacian as a Fourier basis and applies filters as diagonal operators in that basis. The spectral construction, in particular, benefits from smoothness constraints on the filter spectrum, which yield better spatial localization of the filters.

The paper validates these constructions on low-dimensional graph datasets and shows that they can outperform traditional fully connected networks. It also applies the methods to real-world data, such as MNIST on a sphere, where they demonstrate improved performance over traditional methods. The results suggest that graph-based convolutional architectures can significantly reduce the number of parameters in neural networks while maintaining or improving performance. The paper concludes that further research is needed to apply these methods to more complex and non-grid-structured data.
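The spectral idea described above can be sketched in a few lines of numpy: build the graph Laplacian, take its eigenvectors as a graph Fourier basis, and apply a filter as a diagonal multiplier in that basis. This is a minimal illustration, not the authors' implementation; the toy random graph, the number of spectral coefficients `k`, and the use of simple linear interpolation to enforce a smooth spectrum are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy weighted graph on n nodes: random symmetric adjacency W (assumption).
n = 10
W = rng.random((n, n))
W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)

D = np.diag(W.sum(axis=1))    # degree matrix
L = D - W                     # combinatorial graph Laplacian
evals, U = np.linalg.eigh(L)  # columns of U form the graph Fourier basis

# Smoothness constraint on the filter spectrum: parameterize the filter by
# only k coefficients and interpolate them to n spectral values, so the
# learnable parameter count does not grow with the graph size.
k = 4
theta = rng.standard_normal(k)  # learnable filter coefficients (hypothetical values)
spectrum = np.interp(np.linspace(0, 1, n), np.linspace(0, 1, k), theta)

# "Convolution" of a graph signal x = diagonal filtering in the Fourier domain.
x = rng.standard_normal(n)
y = U @ (spectrum * (U.T @ x))
```

In a full network, `theta` would be trained per filter and per layer, and a nonlinearity and pooling over graph clusters would follow; the interpolation step is what gives the spatial localization discussed above.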