5 Feb 2024 | Junfeng Fang, Xinglin Li, Yongduo Sui, Yuan Gao, Guibin Zhang, Kun Wang, Xiang Wang, Xiangnan He
The article presents EXGC, an efficient and explainable graph condensation method that addresses the inefficiency of current graph condensation techniques. Graph condensation (GCond) compresses a large real graph into a smaller synthetic graph while preserving its information content, but existing methods converge slowly, particularly on large-scale web data graphs. EXGC tackles this by employing the Mean-Field (MF) variational approximation to accelerate convergence and by introducing the Gradient Information Bottleneck (GDIB) to prune redundant parameters. By instantiating GDIB with leading explanation techniques such as GNNExplainer and GSAT, EXGC improves both efficiency and explainability.

Evaluated on eight datasets, EXGC delivers significant gains in speed and performance over existing methods, reducing computational overhead while maintaining high accuracy. Beyond efficiency, it offers insight into the training process, and its condensed graphs transfer across different GNN architectures, underscoring the method's broad applicability in graph neural network applications.
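As a rough illustration of the mean-field idea mentioned above (a toy example for exposition, not EXGC's actual algorithm): a joint objective over many variables is optimized one variable at a time, with each update holding the others fixed. This coordinate-wise factorization is what lets mean-field-style methods make cheap, fast-converging updates.

```python
import numpy as np

# Illustrative sketch only: mean-field-style coordinate updates on a
# toy quadratic objective f(x) = 0.5 * x^T A x - b^T x. Each variable
# is updated to its exact minimizer while the others are held fixed,
# analogous to how a mean-field approximation factorizes a joint
# distribution into independent per-variable factors.

rng = np.random.default_rng(0)
R = rng.normal(size=(5, 5))
A = R @ R.T + 5 * np.eye(5)   # symmetric positive definite, well conditioned
b = rng.normal(size=5)

x = np.zeros(5)
for _ in range(100):          # coordinate-ascent sweeps (Gauss-Seidel style)
    for i in range(5):
        # exact minimizer of f along coordinate i, others fixed
        x[i] = (b[i] - A[i] @ x + A[i, i] * x[i]) / A[i, i]

residual = np.linalg.norm(A @ x - b)  # near zero at convergence
```

Each per-coordinate update is closed-form and cheap, so the sweep converges quickly even though no joint solve is ever performed, which is the efficiency argument behind mean-field approximations.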