GTC: GNN-Transformer Co-contrastive Learning for Self-supervised Heterogeneous Graph Representation


22 Mar 2024 | Yundong Sun, Dongjie Zhu, Yansong Wang, Zhaoshuo Tian, Member, IEEE
This paper proposes GTC, a novel framework that combines a GNN and a Transformer to overcome the over-smoothing problem of graph neural networks (GNNs) and to realize self-supervised heterogeneous graph representation learning. GTC uses a GNN branch and a Transformer branch to encode node information from two different views and establishes contrastive learning tasks between the encoded cross-view information. For the Transformer branch, the authors propose Metapath-aware Hop2Token and CG-Hetphormer, which cooperate with the GNN to attentively encode neighborhood information at different levels.

Over-smoothing is a major challenge in deep graph learning: stacking GNN layers to reach multi-hop neighbors makes node representations increasingly indistinguishable, which limits how much multi-hop information a GNN can actually exploit. Transformers, by contrast, can model global information and multi-hop interactions directly via multi-head self-attention. GTC combines the strengths of both: the GNN branch encodes the graph schema view, while the Transformer branch encodes the hops view, in which each node is represented by a sequence of its aggregated multi-hop neighborhood features.
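Hop2Token-style constructions (introduced by NAGphormer, which GTC's Metapath-aware Hop2Token appears to extend to heterogeneous graphs) turn each node's multi-hop neighborhoods into a token sequence that a Transformer can attend over. The sketch below is illustrative rather than the authors' code; it assumes a dense normalized adjacency matrix and plain feature propagation:

```python
import torch

def hop2token(adj_norm: torch.Tensor, x: torch.Tensor, num_hops: int) -> torch.Tensor:
    """Build a per-node token sequence from 0..K-hop propagated features.

    adj_norm: [N, N] normalized (here dense, metapath-based) adjacency matrix.
    x:        [N, F] node feature matrix.
    Returns:  [N, K+1, F] -- token k holds the node's k-hop aggregated features.
    """
    tokens = [x]              # hop 0: the node's own features
    h = x
    for _ in range(num_hops):
        h = adj_norm @ h      # propagate features one hop further
        tokens.append(h)
    return torch.stack(tokens, dim=1)
```

For the metapath-aware variant, this construction would presumably be repeated once per metapath-induced adjacency matrix, giving each node one hop-token sequence per metapath for the Transformer branch to encode.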
Experiments on real datasets show that GTC outperforms existing state-of-the-art methods on node classification, clustering, and visualization tasks. The results demonstrate that GTC captures multi-hop neighbor information without interference from over-smoothing, providing a reference for future research on the over-smoothing problem of GNNs. Throughout, the training signal is self-supervised: the contrastive tasks between the two views' encodings align each node's GNN-branch and Transformer-branch representations without using labels.
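The page does not reproduce the paper's exact loss; a common way to instantiate a cross-view contrastive task like this is a symmetric InfoNCE objective in which a node's two view embeddings form the positive pair and all other nodes serve as negatives. A minimal sketch under that assumption (function and tensor names are illustrative):

```python
import torch
import torch.nn.functional as F

def cross_view_infonce(z_gnn: torch.Tensor, z_trans: torch.Tensor,
                       temperature: float = 0.5) -> torch.Tensor:
    """Symmetric InfoNCE between GNN-branch and Transformer-branch embeddings.

    z_gnn, z_trans: [N, D] embeddings of the same N nodes from the two views.
    """
    z1 = F.normalize(z_gnn, dim=1)
    z2 = F.normalize(z_trans, dim=1)
    logits = z1 @ z2.t() / temperature                   # [N, N] cross-view similarities
    labels = torch.arange(z1.size(0), device=z1.device)  # positives on the diagonal
    # Each view predicts its counterpart in the other view; average both directions.
    return 0.5 * (F.cross_entropy(logits, labels) +
                  F.cross_entropy(logits.t(), labels))
```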