Weisfeiler and Leman Go Neural: Higher-order Graph Neural Networks

2019 | Christopher Morris, Martin Ritzert, Matthias Fey, William L. Hamilton, Jan Eric Lenssen, Gaurav Rattan, Martin Grohe
This paper presents a theoretical analysis of graph neural networks (GNNs) and their relationship to the 1-dimensional Weisfeiler-Leman (1-WL) graph isomorphism heuristic. The authors show that GNNs have the same expressive power as 1-WL in distinguishing non-isomorphic (sub-)graphs. They propose a generalization of GNNs, called k-dimensional GNNs (k-GNNs), which can capture higher-order graph structures at multiple scales. These k-GNNs are more powerful than standard GNNs and can handle hierarchical graph structures.

The authors also introduce a hierarchical variant of k-GNNs that combines graph representations learned at different granularities in an end-to-end trainable framework. Experimental results show that k-GNNs outperform traditional GNNs on graph classification and regression tasks. The study highlights the importance of higher-order graph information in these tasks and demonstrates that k-GNNs can capture structural information that is not visible at the node level. The paper also discusses the limitations of both GNNs and kernel methods, showing that they share similar shortcomings in terms of adaptability to data distributions. Overall, the work provides a theoretical foundation for understanding the expressive power of GNNs and demonstrates their effectiveness in graph learning tasks.
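To make the expressiveness result concrete, the following is a minimal illustrative sketch of 1-WL color refinement, the heuristic the paper proves equivalent in power to standard GNNs. This is not the paper's implementation; the function name, round count, and integer-relabeling scheme are choices made here for illustration.

```python
from collections import Counter

def wl_refine(adj, rounds=3):
    """Sketch of 1-dimensional Weisfeiler-Leman color refinement.

    adj: dict mapping each node to a list of its neighbors.
    Returns the multiset (Counter) of final node colors; two graphs
    with different multisets are certainly non-isomorphic, while equal
    multisets are inconclusive -- exactly the limit shared by GNNs.
    """
    # Start every node with the same color.
    colors = {v: 0 for v in adj}
    for _ in range(rounds):
        # New signature: own color plus sorted multiset of neighbor colors.
        signatures = {
            v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
            for v in adj
        }
        # Compress signatures back to small integer colors.
        palette = {sig: i for i, sig in enumerate(sorted(set(signatures.values())))}
        colors = {v: palette[signatures[v]] for v in adj}
    return Counter(colors.values())
```

A classic failure case motivates the higher-order k-GNNs: two disjoint triangles and a single 6-cycle are non-isomorphic, yet every node in both graphs has degree 2, so 1-WL (and hence any standard GNN) assigns identical color multisets to both.

```python
two_triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1],
                 3: [4, 5], 4: [3, 5], 5: [3, 4]}
hexagon = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
assert wl_refine(two_triangles) == wl_refine(hexagon)  # 1-WL cannot tell them apart
```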