**DropEdge: Towards Deep Graph Convolutional Networks on Node Classification**
**Abstract:**
This paper addresses the challenges of over-fitting and over-smoothing in deep Graph Convolutional Networks (GCNs) for node classification. Over-fitting weakens generalization on small datasets, while over-smoothing hinders training by isolating output representations from input features as network depth increases. DropEdge is introduced as a novel technique to alleviate both issues. It randomly removes a fixed fraction of edges from the input graph at each training epoch, acting as both a data augmenter and a message passing reducer. Theoretical analysis shows that DropEdge either slows the convergence of over-smoothing or relieves the information loss it causes. DropEdge is compatible with various backbone models, including GCN, ResGCN, GraphSAGE, and JKNet, and significantly improves their performance on multiple benchmarks. Extensive experiments validate DropEdge's effectiveness in preventing over-smoothing and enhancing the performance of deep GCNs.
**Introduction:**
Graph Convolutional Networks (GCNs) have become crucial tools for graph representation learning, but they face challenges in deep architectures. Over-fitting and over-smoothing are the two main obstacles. Over-fitting occurs when a deep GCN fits the training data too closely but generalizes poorly to unseen data, especially on small datasets. Over-smoothing makes training difficult because, as depth grows, node representations converge toward indistinguishable values that are isolated from the input features, leading to vanishing gradients and poor performance. DropEdge addresses both issues by randomly dropping edges at each epoch: the random perturbations yield diverse copies of the input graph (countering over-fitting), and the sparser graph reduces message passing between neighbors (countering over-smoothing).
**Related Work:**
The paper reviews existing methods for GCNs, including spectral-based and spatial-based approaches, and discusses deep GCNs and their challenges. It compares DropEdge with other techniques like Dropout, DropNode, and Graph Sparsification, highlighting its unique advantages.
**Methodology:**
DropEdge randomly drops edges from the input graph, perturbing the adjacency matrix. This process acts as a data augmentation technique and a message passing reducer. The paper provides theoretical foundations for DropEdge's effectiveness in preventing over-fitting and over-smoothing.
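The core operation described above can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: it assumes the graph is stored as a 2×E edge-index array (one column per directed edge), and the function name `drop_edge` and the `p` parameter are illustrative choices.

```python
import numpy as np

def drop_edge(edge_index, p, rng=None):
    """Randomly keep each edge with probability (1 - p), resampled every epoch.

    edge_index: 2 x E integer array, one column per directed edge.
    p: fraction of edges to drop (the DropEdge rate).
    Returns a 2 x E' array containing only the surviving edges.
    """
    rng = rng or np.random.default_rng()
    num_edges = edge_index.shape[1]
    # Draw one uniform sample per edge; an edge survives if its draw >= p.
    keep = rng.random(num_edges) >= p
    return edge_index[:, keep]

# Per-epoch usage: the perturbed graph feeds the GCN layers for that epoch only.
edges = np.array([[0, 1, 2, 3],
                  [1, 2, 3, 0]])
sparser = drop_edge(edges, p=0.5)
```

Because a fresh subset of edges is sampled at every epoch, the model sees a different deformation of the graph each time (the data-augmentation effect), while each forward pass aggregates over fewer neighbors (the message-passing-reduction effect). In practice the perturbed adjacency matrix would also be re-normalized before being used in a GCN layer.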
**Experiments:**
The paper evaluates DropEdge on several benchmark datasets, showing significant improvements in accuracy and convergence compared to models without DropEdge. It also compares DropEdge with state-of-the-art methods, demonstrating its effectiveness in enhancing deep GCNs.
**Conclusion:**
DropEdge is a novel and efficient technique that facilitates the development of deep GCNs by preventing over-fitting and over-smoothing. Extensive experiments validate its effectiveness, opening new avenues for deeper exploration of GCNs in various applications.