Heterogeneous Graph Transformer

3 Mar 2020 | Ziniu Hu, Yuxiao Dong, Kuansan Wang, Yizhou Sun
The Heterogeneous Graph Transformer (HGT) is a novel architecture designed to model large-scale heterogeneous graphs. Unlike traditional GNNs that assume homogeneity, HGT accounts for the diversity of node and edge types by using meta relations to parameterize its attention mechanism. This allows HGT to maintain distinct representations for different node and edge types, enabling it to capture the complex structure of heterogeneous graphs. To handle dynamic graphs, HGT introduces relative temporal encoding (RTE), which captures temporal dependencies without requiring fixed timestamps. Additionally, HGT employs a heterogeneous mini-batch graph sampling algorithm (HGSampling) to efficiently train on large-scale graphs. Experiments on the Open Academic Graph, which contains 179 million nodes and 2 billion edges, show that HGT outperforms state-of-the-art GNN baselines by 9–21% across various downstream tasks. The model's ability to automatically learn implicit meta paths and handle dynamic graph structures makes it effective for large-scale heterogeneous graph learning.
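The core mechanism is attention parameterized by the meta relation ⟨source node type, edge type, target node type⟩: each node type gets its own Key/Query/Value projections, and each edge type gets its own attention and message matrices. The PyTorch sketch below illustrates this idea under stated assumptions; the class name HGTAttentionSketch and its interface are hypothetical, and the sketch is single-head and omits RTE, residual connections, and the target-specific aggregation step of the full model.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HGTAttentionSketch(nn.Module):
    """Minimal sketch of meta-relation-parameterized attention.

    Per-node-type Key/Query/Value projections plus per-edge-type
    attention (w_att) and message (w_msg) matrices mean the triple
    <source type, edge type, target type> determines the weights.
    Single-head and simplified relative to the full HGT layer.
    """

    def __init__(self, dim, num_node_types, num_edge_types):
        super().__init__()
        self.dim = dim
        # One K/Q/V projection per node type, indexed by type id.
        self.k_lin = nn.ModuleList(nn.Linear(dim, dim) for _ in range(num_node_types))
        self.q_lin = nn.ModuleList(nn.Linear(dim, dim) for _ in range(num_node_types))
        self.v_lin = nn.ModuleList(nn.Linear(dim, dim) for _ in range(num_node_types))
        # One attention matrix and one message matrix per edge type.
        self.w_att = nn.Parameter(torch.randn(num_edge_types, dim, dim) / dim ** 0.5)
        self.w_msg = nn.Parameter(torch.randn(num_edge_types, dim, dim) / dim ** 0.5)

    def forward(self, h_src, h_dst, src_type, dst_type, edge_type):
        """Aggregate messages from source nodes h_src (N, dim) into one
        target node h_dst (dim,) along edges of a single meta relation."""
        k = self.k_lin[src_type](h_src)   # (N, dim) keys, typed by source
        q = self.q_lin[dst_type](h_dst)   # (dim,) query, typed by target
        v = self.v_lin[src_type](h_src)   # (N, dim) value/message inputs
        # Score each edge with the edge-type-specific bilinear form k W_att q.
        scores = (k @ self.w_att[edge_type]) @ q / self.dim ** 0.5  # (N,)
        alpha = F.softmax(scores, dim=0)
        msg = v @ self.w_msg[edge_type]   # edge-type-specific messages
        return alpha @ msg                # attention-weighted sum, (dim,)

# Toy usage: three "paper" neighbors attending into one "author" node.
h_papers = torch.randn(3, 16)
h_author = torch.randn(16)
layer = HGTAttentionSketch(dim=16, num_node_types=2, num_edge_types=1)
out = layer(h_papers, h_author, src_type=0, dst_type=1, edge_type=0)
print(out.shape)  # torch.Size([16])
```

Because these type-specific weights are shared across the whole graph, stacking such layers composes meta-relation parameterizations over multi-hop neighborhoods, which is how HGT can learn implicit meta paths without enumerating them by hand.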