Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


GATv2

Graph Attention Network v2

Graphs · Introduced 2021 · 9 papers
Source Paper

Description

The GATv2 operator from the “How Attentive are Graph Attention Networks?” paper, which fixes the static attention problem of the standard GAT layer: because the two linear layers in standard GAT are applied consecutively, they collapse into a single linear operation, so the ranking of attention scores over attended nodes is the same regardless of the query node. In contrast, GATv2 applies the nonlinearity between the two linear layers, making the attention ranking query-dependent, so every node can attend to any other node.

GATv2 scoring function:

e_{i,j} = \mathbf{a}^{\top}\,\mathrm{LeakyReLU}\left(\mathbf{W}\,[\mathbf{h}_i \,\Vert\, \mathbf{h}_j]\right)
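The scoring function above can be sketched directly in NumPy. This is a minimal, single-head illustration (not the library implementation): `h` holds node features, `W` and `a` are the learnable weight matrix and attention vector, and the unnormalized scores are then softmax-normalized over each node's neighborhood. All names and shapes are illustrative assumptions.

```python
import numpy as np

def leaky_relu(x, slope=0.2):
    # LeakyReLU with the negative slope used in the GAT family of papers.
    return np.where(x > 0, x, slope * x)

def gatv2_scores(h, W, a, slope=0.2):
    """Unnormalized GATv2 scores e[i, j] = a^T LeakyReLU(W [h_i || h_j]).

    h: (N, F) node features
    W: (F_out, 2F) weight matrix applied to the concatenated pair
    a: (F_out,) attention vector
    """
    N = h.shape[0]
    e = np.empty((N, N))
    for i in range(N):
        for j in range(N):
            concat = np.concatenate([h[i], h[j]])       # [h_i || h_j], shape (2F,)
            e[i, j] = a @ leaky_relu(W @ concat, slope)  # nonlinearity BEFORE a^T
    return e

def attention(e, adj):
    """Row-wise softmax of e restricted to neighbors (adj is a boolean (N, N) mask)."""
    scores = np.where(adj, e, -np.inf)
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    w = np.exp(scores)
    return w / w.sum(axis=1, keepdims=True)
```

Note that the LeakyReLU sits between the linear map `W` and the product with `a`; in the original GAT both linear maps act before the nonlinearity, which is exactly what makes its attention ranking static.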

Papers Using This Method

- Graph Neural Networks in Supply Chain Analytics and Optimization: Concepts, Perspectives, Dataset and Benchmarks (2024-11-13)
- Perceived Text Relevance Estimation Using Scanpaths and GNNs (2024-11-04)
- GATher: Graph Attention Based Predictions of Gene-Disease Links (2024-09-23)
- On the Power of Graph Neural Networks and Feature Augmentation Strategies to Classify Social Networks (2024-01-11)
- Optimization and Interpretability of Graph Attention Networks for Small Sparse Graph Structures in Automotive Applications (2023-05-25)
- Gradient Derivation for Learnable Parameters in Graph Attention Networks (2023-04-21)
- A New Graph Node Classification Benchmark: Learning Structure from Histology Cell Graphs (2022-11-11)
- Distance-Geometric Graph Attention Network (DG-GAT) for 3D Molecular Geometry (2022-07-16)
- How Attentive are Graph Attention Networks? (2021-05-30)