2024 | Yonghua Zhu, Lei Feng, Zhenyun Deng, Yang Chen, Robert Amor, Michael Witbrock
This paper addresses the challenge of robust node classification on graph data that is contaminated by both graph (structure) noise and label noise. The authors propose Robust Node Classification under Graph and Label Noise (RNCGLN) to handle both issues simultaneously. RNCGLN combines a graph contrastive loss for local graph learning with multi-head self-attention for global graph learning, improving the expressiveness of node representations, and it uses pseudo graphs and pseudo labels to correct graph and label noise, respectively. Extensive experiments on four datasets show that RNCGLN outperforms existing methods in node classification and is more robust to both graph and label noise. Ablation studies further validate the contribution of each component under various noise scenarios.
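To make the described architecture concrete, below is a minimal, hedged sketch of a node-classification pipeline that pairs a local propagation step and a contrastive (InfoNCE-style) loss with global multi-head self-attention, as the abstract describes. The layer sizes, feature-dropout augmentation, temperature, and the `LocalGlobalEncoder`/`info_nce` names are illustrative assumptions, not the authors' actual RNCGLN implementation; the pseudo-graph and pseudo-label components are omitted.

```python
# Hedged sketch (assumptions noted in the lead-in): local propagation + global
# self-attention, trained with an InfoNCE contrastive term plus cross-entropy.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LocalGlobalEncoder(nn.Module):
    def __init__(self, in_dim, hid_dim, n_classes, n_heads=4):
        super().__init__()
        self.local_fc = nn.Linear(in_dim, hid_dim)  # local (neighbourhood) view
        self.attn = nn.MultiheadAttention(hid_dim, n_heads, batch_first=True)  # global view
        self.classifier = nn.Linear(hid_dim, n_classes)

    def forward(self, x, adj):
        # Local graph learning: one propagation step over the (possibly noisy) adjacency.
        h_local = F.relu(adj @ self.local_fc(x))
        # Global graph learning: self-attention over all nodes, treated as one sequence.
        h_global, _ = self.attn(h_local.unsqueeze(0), h_local.unsqueeze(0), h_local.unsqueeze(0))
        h = h_local + h_global.squeeze(0)
        return h, self.classifier(h)

def info_nce(z1, z2, tau=0.5):
    """Simple InfoNCE contrastive loss between two augmented node views."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / tau
    labels = torch.arange(z1.size(0))
    return F.cross_entropy(logits, labels)

# Toy usage with random features, a placeholder adjacency, and (noisy) labels.
n, d, c = 32, 16, 3
x = torch.randn(n, d)
adj = torch.eye(n)                  # placeholder adjacency (self-loops only)
y = torch.randint(0, c, (n,))
model = LocalGlobalEncoder(d, 32, c)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

for _ in range(5):
    # Two stochastic feature-dropout views drive the contrastive term.
    h1, logits1 = model(F.dropout(x, 0.2), adj)
    h2, _ = model(F.dropout(x, 0.2), adj)
    loss = info_nce(h1, h2) + F.cross_entropy(logits1, y)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The contrastive term encourages representations that are stable under perturbation, which is one plausible way such a model can tolerate noisy edges and labels; the paper's actual noise-correction mechanism relies on its pseudo graphs and pseudo labels.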