Hierarchical Neighbor Propagation With Bidirectional Graph Attention Network for Relation Prediction

Abstract—The graph attention network (GAT) [1] has become a mainstream neural network architecture since 2018, yielding remarkable performance gains on various natural language processing (NLP) tasks. Although GAT has recently achieved state-of-the-art (SOTA) performance in relation prediction on knowledge graphs, the current model is still limited in two respects: (1) it only considers neighbors in the inbound direction of a given entity and ignores the rich neighborhood information available in the outbound direction; (2) it uses only the k-th-hop output to learn multi-hop embeddings, which discards a large amount of early-stage embedding information (e.g., one-hop) at the graph attention step. In this study, we propose a novel bidirectional graph attention network (BiGAT) to learn hierarchical neighbor propagation. In the proposed BiGAT, an inbound-directional GAT and an outbound-directional GAT are introduced to capture sufficient neighborhood information before the bidirectional neighborhood information is propagated to learn multi-hop feature embeddings in a hierarchical manner. Experiments on four publicly available datasets show that BiGAT achieves competitive results in comparison with other SOTA methods.

IEEE/ACM Transactions, 2021
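
The abstract describes two ideas: running a GAT over both inbound and outbound edge directions, and combining the outputs of every hop rather than only the last one. Below is a minimal PyTorch sketch of that structure, written only under the assumptions stated in the abstract; the class and function names (DirectionalGATLayer, BiGATBlock, HierarchicalBiGAT) and all hyperparameters are illustrative, not taken from the paper or any released code (none is listed).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DirectionalGATLayer(nn.Module):
    """Single-direction graph attention layer (illustrative, not the paper's exact layer)."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)
        self.attn = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, h, edge_index):
        # edge_index: (2, E); messages flow from edge_index[0] (source) to edge_index[1] (target).
        src, dst = edge_index
        z = self.W(h)                                                        # (N, out_dim)
        e = F.leaky_relu(self.attn(torch.cat([z[src], z[dst]], -1))).squeeze(-1)
        # Attention weights are normalized over the incoming edges of each target node.
        alpha = torch.zeros_like(e)
        for node in dst.unique():
            mask = dst == node
            alpha[mask] = F.softmax(e[mask], dim=0)
        out = torch.zeros_like(z)
        out.index_add_(0, dst, alpha.unsqueeze(-1) * z[src])
        return F.elu(out)


class BiGATBlock(nn.Module):
    """Bidirectional block: one GAT over the original (inbound) edges,
    one over the reversed (outbound) edges, with a learned fusion."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.in_gat = DirectionalGATLayer(in_dim, out_dim)
        self.out_gat = DirectionalGATLayer(in_dim, out_dim)
        self.fuse = nn.Linear(2 * out_dim, out_dim)

    def forward(self, h, edge_index):
        h_in = self.in_gat(h, edge_index)
        h_out = self.out_gat(h, edge_index.flip(0))   # reversed edges give the outbound view
        return self.fuse(torch.cat([h_in, h_out], dim=-1))


class HierarchicalBiGAT(nn.Module):
    """Stacks K bidirectional blocks and keeps every hop's output, so early-hop
    embeddings are not discarded when forming the multi-hop representation."""

    def __init__(self, dim, num_hops=2):
        super().__init__()
        self.blocks = nn.ModuleList(BiGATBlock(dim, dim) for _ in range(num_hops))
        self.combine = nn.Linear(dim * num_hops, dim)

    def forward(self, h, edge_index):
        per_hop = []
        for block in self.blocks:
            h = block(h, edge_index)
            per_hop.append(h)
        return self.combine(torch.cat(per_hop, dim=-1))


if __name__ == "__main__":
    h = torch.randn(5, 16)                               # 5 entities, 16-dim features
    edges = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 4]])   # directed edges (source row, target row)
    model = HierarchicalBiGAT(dim=16, num_hops=2)
    print(model(h, edges).shape)                         # torch.Size([5, 16])
```

A relation-prediction scoring function (e.g., over entity-pair embeddings produced by this encoder) would sit on top of the final combined representation; the abstract does not specify it, so it is omitted here.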