19 Jun 2019 | Sami Abu-El-Haija, Bryan Perozzi, Amol Kapoor, Nazanin Alipourfard, Kristina Lerman, Hrayr Harutyunyan, Greg Ver Steeg, Aram Galstyan
The paper introduces MixHop, a novel graph convolutional architecture designed to learn higher-order neighborhood mixing relationships, which are crucial for semi-supervised learning tasks. Popular graph convolutional networks (GCNs), including the vanilla GCN, cannot learn these relationships; in particular, they cannot represent difference operators between neighborhood scales, which are needed for tasks such as learning Gabor-like filters on graphs. MixHop addresses this limitation by mixing feature representations of neighbors at various distances, without increasing computational complexity or memory requirements.
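To make the difference-operator idea concrete, here is a minimal NumPy sketch (the toy graph and all names are mine, not from the paper) contrasting one-hop and two-hop aggregations. A single vanilla GCN layer can only produce the one-hop term, so it cannot represent their difference no matter what weights it learns:

```python
import numpy as np

def normalized_adjacency(A):
    """Symmetrically normalized adjacency with self-loops,
    A_hat = D^{-1/2} (A + I) D^{-1/2}, as used by GCN-style models."""
    A = A + np.eye(A.shape[0])
    d = A.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A @ D_inv_sqrt

# Toy graph: a path 0-1-2-3, with random node features X.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.randn(4, 3)

A_hat = normalized_adjacency(A)

one_hop = A_hat @ X           # what a single GCN layer sees
two_hop = A_hat @ A_hat @ X   # second-power aggregation

# A "delta operator": the difference between neighborhood scales.
delta = one_hop - two_hop
```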
Key contributions of the paper include:
1. **Formalizing Delta Operators and Neighborhood Mixing**: The authors define Delta Operators and generalize them to Neighborhood Mixing, showing that popular GCNs cannot learn these representations.
2. **Proposing MixHop**: A new graph convolution (GC) layer that mixes powers of the adjacency matrix, enabling the model to learn a wider class of representations (a minimal sketch follows this list).
3. **Learning to Divide Modeling Capacity**: The method allows for learning how to divide modeling capacity among different widths and depths of the MixHop model, leading to powerful and compact GCN architectures.
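The MixHop layer concatenates, over a set of adjacency powers $j \in P$, the representations $\sigma(\hat{A}^j H^{(i)} W_j^{(i)})$. Below is a minimal NumPy sketch of one such layer; the ReLU activation, dense matrices, and all names are my choices for illustration, not the paper's reference implementation:

```python
import numpy as np

def mixhop_layer(A_hat, H, weights, powers=(0, 1, 2)):
    """One MixHop-style layer: concatenate sigma(A_hat^j @ H @ W_j)
    over a set of adjacency powers j.

    A_hat   : normalized adjacency, shape (n, n)
    H       : input features,       shape (n, d_in)
    weights : dict mapping power j -> W_j of shape (d_in, d_j)
    """
    relu = lambda x: np.maximum(x, 0.0)
    outputs = []
    AjH = H  # A_hat^0 @ H
    for j in range(max(powers) + 1):
        if j > 0:
            AjH = A_hat @ AjH  # incrementally build A_hat^j @ H
        if j in powers:
            outputs.append(relu(AjH @ weights[j]))
    # Column-wise concatenation of the per-power representations.
    return np.concatenate(outputs, axis=1)

# Usage on the toy graph above: three powers with 4 hidden units
# each gives an output of shape (4, 12).
rng = np.random.default_rng(0)
weights = {j: 0.1 * rng.standard_normal((3, 4)) for j in (0, 1, 2)}
H1 = mixhop_layer(A_hat, X, weights)
```

Note that the sketch never materializes $\hat{A}^j$ explicitly: each power reuses the previous product $\hat{A}^{j-1} H$, which is why mixing powers adds no asymptotic cost over stacking vanilla GCN layers.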
The paper also introduces a sparsity regularization that reveals how the network prioritizes neighborhood information across different graph datasets. Experimental results demonstrate that MixHop outperforms existing baselines on challenging semi-supervised node classification tasks, particularly on graphs with low homophily (low correlation between edges and labels). The learned architectures vary across datasets, confirming that the optimal architecture differs from graph to graph.
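As a rough illustration of how such a regularizer makes prioritization visible, here is a NumPy sketch in the spirit of a group lasso over the per-power weight matrices; the column-wise grouping and both function names are my own assumptions, not a verbatim reproduction of the paper's scheme:

```python
import numpy as np

def group_lasso_penalty(weights):
    """Group-lasso-style penalty: for each per-power weight matrix W_j,
    sum the L2 norms of its columns. Driving a whole column to zero
    prunes one output unit assigned to that adjacency power."""
    return sum(np.sqrt((W ** 2).sum(axis=0)).sum() for W in weights.values())

def power_importance(weights):
    """Share of total column norm held by each adjacency power j --
    a rough proxy for how the model allocates capacity across hops."""
    norms = {j: np.sqrt((W ** 2).sum(axis=0)).sum() for j, W in weights.items()}
    total = sum(norms.values())
    return {j: norm / total for j, norm in norms.items()}
```

Adding a term like `lam * group_lasso_penalty(weights)` to the training loss and inspecting `power_importance` after training would yield the kind of per-dataset prioritization picture described above.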