2017 | Junfeng Fang, Xinglin Li, Yongduo Sui, Yuan Gao, Guibin Zhang, Kun Wang, Xiang Wang, Xiangnan He
EXGC: Bridging Efficiency and Explainability in Graph Condensation
This paper proposes EXGC, an efficient and explainable graph condensation method that addresses two major inefficiencies in current graph condensation (GCond) approaches: (1) the concurrent updating of a vast parameter set and (2) pronounced parameter redundancy. To improve efficiency, EXGC employs the Mean-Field variational approximation to accelerate convergence. To reduce redundancy, it introduces the Gradient Information Bottleneck (GDIB) objective, which selects only the informative nodes for training. By integrating leading explanation techniques such as GNNExplainer and GSAT to instantiate GDIB, EXGC achieves both efficiency and explainability. Evaluated on eight datasets, the method yields significant gains in speed and accuracy over existing approaches, reducing graph size substantially while maintaining high performance. It also transfers to other graph condensation frameworks, such as DosCond, and remains effective across various GNN architectures. The paper highlights the importance of explainability in graph learning and provides a practical solution for efficient and interpretable graph condensation.
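The core recipe described above (match gradients between real and synthetic data while a learnable mask prunes uninformative synthetic nodes) can be sketched compactly. The snippet below is a hypothetical, illustrative PyTorch sketch, not the authors' code or the exact GDIB objective: it ignores graph structure (a plain MLP on node features instead of a GNN), fixes the model at a random initialization, and stands in for the information-bottleneck term with a simple sparsity penalty on the node mask.

```python
# Illustrative sketch: gradient-matching condensation with a learnable
# node-selection mask, loosely in the spirit of EXGC's informative-node idea.
# All names and hyperparameters here are made up for the example.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Toy "real" node-classification data (stand-in for a graph dataset).
n_real, n_syn, d, c = 512, 32, 16, 4
X_real = torch.randn(n_real, d)
y_real = torch.randint(0, c, (n_real,))

# Learnable synthetic set: features plus a mask scoring node importance.
X_syn = torch.randn(n_syn, d, requires_grad=True)
y_syn = torch.arange(n_syn) % c                        # fixed, class-balanced labels
mask_logits = torch.zeros(n_syn, requires_grad=True)   # sigmoid -> keep probability

model = nn.Sequential(nn.Linear(d, 32), nn.ReLU(), nn.Linear(32, c))
opt = torch.optim.Adam([X_syn, mask_logits], lr=0.01)
beta = 0.05  # sparsity weight: crude stand-in for a bottleneck-style penalty

def grads_of(loss):
    # Gradients w.r.t. model parameters, kept differentiable so the matching
    # loss can backpropagate into X_syn and mask_logits.
    return torch.autograd.grad(loss, model.parameters(), create_graph=True)

for step in range(200):
    # Gradient of the classification loss on the real data (treated as a target).
    g_real = [g.detach() for g in grads_of(F.cross_entropy(model(X_real), y_real))]

    # Gradient on the synthetic data, with per-node weights from the mask.
    w = torch.sigmoid(mask_logits)
    per_node = F.cross_entropy(model(X_syn), y_syn, reduction="none")
    g_syn = grads_of((w * per_node).sum() / w.sum())

    # Match gradients and encourage a small, informative synthetic node set.
    match = sum(F.mse_loss(a, b) for a, b in zip(g_syn, g_real))
    loss = match + beta * w.mean()

    opt.zero_grad()
    loss.backward()
    opt.step()

kept = (torch.sigmoid(mask_logits) > 0.5).sum().item()
print(f"matching loss {match.item():.4f}, nodes kept {kept}/{n_syn}")
```

In the sketch, nodes whose mask probability stays low contribute little to the matched gradient and can be dropped, which is the intuition behind pruning redundant synthetic parameters; the actual EXGC pipeline derives this selection from the GDIB objective and instantiates it with explanation methods such as GNNExplainer and GSAT.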