A Graph Attention Network (GAT) is a neural network architecture that operates on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior methods based on graph convolutions or their approximations. By stacking layers in which nodes attend over their neighborhoods' features, a GAT implicitly assigns different weights to different nodes in a neighborhood, without requiring any kind of costly matrix operation (such as inversion) or depending on knowing the graph structure upfront.
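The masked self-attention described above can be sketched in a few lines. The following is a minimal NumPy implementation of a single attention head, following the formulation in the GAT paper: a shared linear transform `W`, attention scores `e_ij = LeakyReLU(a^T [Wh_i || Wh_j])` computed only over each node's neighborhood, and a softmax over those neighbors. The function names and the toy graph are illustrative, not from any particular library.

```python
import numpy as np

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def gat_layer(H, A, W, a):
    """One masked self-attention (GAT) head.

    H: (N, F) node features
    A: (N, N) adjacency matrix with self-loops (1 = edge)
    W: (F, Fp) shared linear transform
    a: (2*Fp,) attention parameters
    """
    Z = H @ W                                   # transformed features, (N, Fp)
    Fp = Z.shape[1]
    # e_ij = LeakyReLU(a^T [z_i || z_j]); splitting a lets us compute all
    # pairwise scores with one broadcast instead of explicit concatenation
    e = leaky_relu((Z @ a[:Fp])[:, None] + (Z @ a[Fp:])[None, :])
    # mask: a node attends only over its neighborhood (including itself)
    e = np.where(A > 0, e, -np.inf)
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha /= alpha.sum(axis=1, keepdims=True)   # row-wise softmax
    return alpha @ Z, alpha                     # aggregated features, weights

# toy example: 3-node path graph with self-loops
rng = np.random.default_rng(0)
A = np.array([[1, 1, 0],
              [1, 1, 1],
              [0, 1, 1]])
H = rng.standard_normal((3, 4))
out, alpha = gat_layer(H, A, rng.standard_normal((4, 2)), rng.standard_normal(4))
```

The masking is what makes the attention graph-aware: non-neighbors receive a score of negative infinity, so they get zero weight after the softmax, and no dense matrix operation over the whole graph is required beyond the masked score matrix. Multi-head attention, as used in the paper, simply runs several such heads and concatenates (or averages) their outputs.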
Source: Graph Attention Networks (Veličković et al., ICLR 2018)
Task | Papers | Share
---|---|---
Graph Attention | 69 | 24.21%
Node Classification | 30 | 10.53%
Link Prediction | 9 | 3.16%
Graph Classification | 8 | 2.81%
Graph Learning | 8 | 2.81%
Knowledge Graphs | 6 | 2.11%
Graph Representation Learning | 5 | 1.75%
Benchmarking | 5 | 1.75%
General Classification | 5 | 1.75%