HOW ATTENTIVE ARE GRAPH ATTENTION NETWORKS?

31 Jan 2022 | Shaked Brody, Uri Alon, Eran Yahav
Graph Attention Networks (GATs) are a popular GNN architecture for graph-based representation learning. However, this paper shows that GATs compute only a limited form of attention, which the authors call static attention: the ranking of attention scores over keys is global and unconditioned on the query node. This limits GAT's expressive power, since it cannot solve graph problems that require dynamic attention, where different queries attend most strongly to different keys. To address this, the authors propose GATv2, a modified version of GAT that computes dynamic attention simply by changing the order of operations in the scoring function. GATv2 is strictly more expressive than GAT and performs better across multiple benchmarks, spanning node-, link-, and graph-prediction tasks. The paper also demonstrates that GATv2 is more robust to edge noise and can solve complex relational problems that GAT cannot. The authors conclude that GATv2 is a more powerful and flexible graph attention mechanism than GAT.
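The contrast between the two scoring functions can be sketched numerically. Below is a minimal NumPy illustration (not the authors' code; function names and dimensions are illustrative): GAT computes LeakyReLU(a^T [Wh_i || Wh_j]), so the query only adds a constant before a monotone nonlinearity and every query ranks the keys identically, while GATv2 computes a^T LeakyReLU(W [h_i || h_j]), mixing query and key before the nonlinearity.

```python
import numpy as np

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def gat_score(a, W, h_i, h_j):
    # GAT:   e(h_i, h_j) = LeakyReLU(a^T [W h_i || W h_j])
    return float(leaky_relu(a @ np.concatenate([W @ h_i, W @ h_j])))

def gatv2_score(a, W, h_i, h_j):
    # GATv2: e(h_i, h_j) = a^T LeakyReLU(W [h_i || h_j])
    return float(a @ leaky_relu(W @ np.concatenate([h_i, h_j])))

rng = np.random.default_rng(0)
d = 4                                  # toy feature dimension
a1 = rng.standard_normal(2 * d)        # GAT attention vector
W1 = rng.standard_normal((d, d))       # GAT weight matrix
a2 = rng.standard_normal(d)            # GATv2 attention vector
W2 = rng.standard_normal((d, 2 * d))   # GATv2 weight matrix

queries = rng.standard_normal((3, d))  # three query nodes
keys = rng.standard_normal((5, d))     # five candidate neighbours

# GAT: a1^T [W h_i || W h_j] splits into (query term) + (key term);
# the query term is a constant for each i and LeakyReLU is monotone,
# so all queries produce the same ranking of the keys: static attention.
gat_rankings = [np.argsort([gat_score(a1, W1, q, k) for k in keys])
                for q in queries]
print(all(np.array_equal(r, gat_rankings[0]) for r in gat_rankings))  # True

# GATv2 applies the nonlinearity to the mixed query/key vector, so
# different queries can rank the keys differently: dynamic attention.
gatv2_rankings = [np.argsort([gatv2_score(a2, W2, q, k) for k in keys])
                  for q in queries]
print([list(r) for r in gatv2_rankings])
```

The key point is not the specific weights but the structure: in GAT the attention vector `a1` is applied after the nonlinearity to a concatenation whose query half is fixed per query, which is why its key ranking cannot depend on the query.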
[slides and audio] How Attentive are Graph Attention Networks?