Generative Relation Linking for Question Answering over Knowledge Bases

Relation linking is essential for question answering over knowledge bases. Although various efforts have been made to improve relation linking performance, current state-of-the-art methods still fall short of optimal results, which negatively impacts end-to-end question answering performance. In this work, we propose a novel approach that frames relation linking as a generative problem, facilitating the use of pre-trained sequence-to-sequence models. We extend such sequence-to-sequence models by infusing structured data from the target knowledge base, primarily to enable these models to handle the nuances of the knowledge base. Moreover, we train the model to generate a structured output consisting of a list of argument-relation pairs, enabling a knowledge validation step. We compare our method against existing relation linking systems on four datasets derived from DBpedia and Wikidata. Our method reports large improvements over the state-of-the-art while using a much simpler model that can be easily adapted to different knowledge bases.
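
The sketch below illustrates the generative formulation described in the abstract: a pre-trained sequence-to-sequence model generates a serialized list of argument-relation pairs for a question, followed by a knowledge validation step against the target knowledge base. It is a minimal illustration assuming a Hugging Face BART checkpoint; the output serialization format, the placeholder model name, and the validation helper are assumptions, not the paper's exact specification.

```python
# Illustrative sketch: relation linking as sequence generation plus KB validation.
# The serialization format and the validation step are assumptions for demonstration;
# a model fine-tuned on question -> argument-relation pairs would be used in practice.
from transformers import BartForConditionalGeneration, BartTokenizer

MODEL_NAME = "facebook/bart-base"  # placeholder; not the paper's released checkpoint
tokenizer = BartTokenizer.from_pretrained(MODEL_NAME)
model = BartForConditionalGeneration.from_pretrained(MODEL_NAME)

def link_relations(question: str, kb_relations: set[str]) -> list[tuple[str, str]]:
    """Generate argument-relation pairs for a question, then keep only those
    whose relation exists in the target knowledge base (validation step)."""
    inputs = tokenizer(question, return_tensors="pt")
    output_ids = model.generate(**inputs, max_length=64, num_beams=4)
    decoded = tokenizer.decode(output_ids[0], skip_special_tokens=True)

    # Hypothetical output format: "Barack Obama | dbo:spouse ; wife | dbo:birthPlace"
    pairs = []
    for chunk in decoded.split(";"):
        parts = [p.strip() for p in chunk.split("|")]
        if len(parts) == 2:
            pairs.append((parts[0], parts[1]))

    # Knowledge validation: discard pairs whose relation is not in the KB schema.
    return [(arg, rel) for arg, rel in pairs if rel in kb_relations]

# Example usage with a toy DBpedia relation inventory.
kb = {"dbo:spouse", "dbo:birthPlace"}
print(link_relations("Where was the wife of Barack Obama born?", kb))
```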

Datasets

LC-QuAD, LC-QuAD 1.0, QALD-9, SimpleQuestions WD


Results from the Paper


Task               Dataset               Model   Metric   Value   Global Rank
Relation Linking   LC-QuAD               GenRL   F1       0.84    #1
Relation Linking   LC-QuAD 1.0           GenRL   F1       0.60    #2
Relation Linking   QALD-9                GenRL   F1       0.53    #1
Relation Linking   SimpleQuestions WD    GenRL   F1       0.98    #1

Methods


No methods listed for this paper.