Quaternion Knowledge Graph Embeddings

NeurIPS 2019  ·  Shuai Zhang, Yi Tay, Lina Yao, Qi Liu

In this work, we move beyond traditional complex-valued representations, introducing more expressive hypercomplex representations to model entities and relations for knowledge graph embeddings. More specifically, quaternion embeddings, hypercomplex-valued embeddings with three imaginary components, are used to represent entities, while relations are modelled as rotations in quaternion space. The advantages of the proposed approach are: (1) latent inter-dependencies (between all components) are aptly captured by the Hamilton product, encouraging a more compact interaction between entities and relations; (2) quaternions enable expressive rotation in four-dimensional space and have more degrees of freedom than rotation in the complex plane; (3) the proposed framework is a generalization of ComplEx to the hypercomplex space while offering better geometric interpretations, concurrently satisfying the key desiderata of relational representation learning (i.e., modelling symmetry, anti-symmetry, and inversion). Experimental results demonstrate that our method achieves state-of-the-art performance on four well-established knowledge graph completion benchmarks.
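To make the rotation-based scoring concrete, the sketch below illustrates the core operations the abstract describes: entity and relation embeddings are treated as d-dimensional arrays of quaternions, the relation quaternion is normalised to unit length, the head entity is rotated via the Hamilton product, and the score is the quaternion inner product with the tail. This is a minimal NumPy illustration of the idea, not the authors' released implementation; the function names (`hamilton_product`, `quate_score`) and the (4, d) array layout are assumptions chosen for clarity.

```python
import numpy as np

def hamilton_product(q1, q2):
    """Hamilton product of two quaternion embedding blocks.

    Each argument has shape (4, d): rows hold the real part and the
    three imaginary parts (i, j, k), each of dimension d.
    """
    a1, b1, c1, d1 = q1
    a2, b2, c2, d2 = q2
    return np.stack([
        a1 * a2 - b1 * b2 - c1 * c2 - d1 * d2,  # real
        a1 * b2 + b1 * a2 + c1 * d2 - d1 * c2,  # i
        a1 * c2 - b1 * d2 + c1 * a2 + d1 * b2,  # j
        a1 * d2 + b1 * c2 - c1 * b2 + d1 * a2,  # k
    ])

def quate_score(head, rel, tail):
    """Illustrative QuatE-style score: rotate the head by the unit
    relation quaternion (Hamilton product), then take the quaternion
    inner product with the tail."""
    # Normalise each of the d relation quaternions to unit length.
    rel_unit = rel / np.linalg.norm(rel, axis=0, keepdims=True)
    rotated = hamilton_product(head, rel_unit)
    return float(np.sum(rotated * tail))

# Toy usage with 16 quaternion coordinates per entity/relation (hypothetical sizes).
rng = np.random.default_rng(0)
h, r, t = (rng.standard_normal((4, 16)) for _ in range(3))
print(quate_score(h, r, t))
```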


Results from the Paper


Task             Dataset    Model  Metric   Value  Global Rank
Link Prediction  FB15k      QuatE  MR       17     # 1
Link Prediction  FB15k      QuatE  MRR      0.833  # 6
Link Prediction  FB15k      QuatE  Hits@10  0.900  # 7
Link Prediction  FB15k      QuatE  Hits@3   0.859  # 1
Link Prediction  FB15k      QuatE  Hits@1   0.800  # 1
Link Prediction  FB15k-237  QuatE  MR       87     # 1
Link Prediction  FB15k-237  QuatE  MRR      0.348  # 34
Link Prediction  FB15k-237  QuatE  Hits@10  0.550  # 11
Link Prediction  FB15k-237  QuatE  Hits@3   0.382  # 28
Link Prediction  FB15k-237  QuatE  Hits@1   0.248  # 36
Link Prediction  WN18       QuatE  MR       162    # 3
Link Prediction  WN18       QuatE  MRR      0.95   # 9
Link Prediction  WN18       QuatE  Hits@10  0.959  # 7
Link Prediction  WN18       QuatE  Hits@3   0.954  # 6
Link Prediction  WN18       QuatE  Hits@1   0.945  # 8
Link Prediction  WN18RR     QuatE  MR       2314   # 16
Link Prediction  WN18RR     QuatE  MRR      0.488  # 26
Link Prediction  WN18RR     QuatE  Hits@10  0.582  # 20
Link Prediction  WN18RR     QuatE  Hits@3   0.508  # 20
Link Prediction  WN18RR     QuatE  Hits@1   0.438  # 36
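The table reports the standard filtered ranking metrics for link prediction: MR (mean rank of the correct entity, lower is better), MRR (mean reciprocal rank, higher is better), and Hits@k (fraction of test queries whose correct entity ranks within the top k). As a minimal sketch of how these summaries are computed from per-query ranks (the helper name `ranking_metrics` and the toy rank list are assumptions, not values from the paper):

```python
import numpy as np

def ranking_metrics(ranks, ks=(1, 3, 10)):
    """Compute MR, MRR and Hits@k from the rank of the correct entity
    for each test query (filtered ranks assumed)."""
    ranks = np.asarray(ranks, dtype=float)
    metrics = {"MR": ranks.mean(), "MRR": (1.0 / ranks).mean()}
    for k in ks:
        metrics[f"Hits@{k}"] = (ranks <= k).mean()
    return metrics

# Toy example with five hypothetical ranks.
print(ranking_metrics([1, 2, 5, 12, 1]))
```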
