Question Answering over Knowledge Base using Language Model Embeddings

17 Oct 2020 · Sai Sharath Japa, Banafsheh Rekabdar

A Knowledge Base represents facts about the world explicitly, often in some form of subsumption ontology, rather than implicitly, embedded in procedural code the way a conventional computer program does. While knowledge bases are growing rapidly, retrieving information from them remains a challenge. Knowledge Base Question Answering is one of the promising approaches for extracting substantial knowledge from Knowledge Bases. Unlike web search, Question Answering over a knowledge base gives accurate and concise results, provided that natural language questions can be understood and mapped precisely to an answer in the knowledge base. However, some of the existing embedding-based methods for knowledge base question answering ignore the subtle correlation between the question and the Knowledge Base (e.g., entity types, relation paths, and context) and suffer from the Out-Of-Vocabulary (OOV) problem. In this paper, we focus on using a pre-trained language model for the Knowledge Base Question Answering task. We first use BERT-base-uncased embeddings for the initial experiments, and then fine-tune these embeddings with a two-way attention mechanism: from the knowledge base to the asked question, and from the asked question to the knowledge base answer aspects. Our method builds on a simple Convolutional Neural Network architecture with a Multi-Head Attention mechanism to represent the asked question dynamically in multiple aspects. Our experimental results show the effectiveness and superiority of BERT pre-trained language model embeddings for question answering over knowledge bases compared to other well-known embedding methods.
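The pipeline the abstract outlines (BERT token embeddings, attention between the question and KB answer aspects, and a CNN question encoder with multi-head attention) can be sketched roughly as follows. This is a minimal illustrative sketch under stated assumptions, not the authors' released implementation: the names `QuestionEncoder`, `score_candidate`, and `embed`, the single attention direction shown, the pooling choices, and all hyperparameters are hypothetical.

```python
# Illustrative sketch only: BERT embeddings for the question and for a
# candidate KB answer aspect (entity type, relation path, or context),
# multi-head attention from the aspect to the question tokens, and a small
# CNN encoder. All names and hyperparameters are assumptions.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased").eval()
HIDDEN = bert.config.hidden_size  # 768 for bert-base-uncased


class QuestionEncoder(nn.Module):
    """CNN + multi-head attention over BERT token embeddings (illustrative)."""

    def __init__(self, hidden=HIDDEN, heads=8, kernel=3):
        super().__init__()
        self.attn = nn.MultiheadAttention(hidden, heads, batch_first=True)
        self.conv = nn.Conv1d(hidden, hidden, kernel, padding=kernel // 2)

    def forward(self, question_tokens, aspect_vec):
        # One direction of the two-way attention described in the abstract:
        # the KB answer aspect attends to the question tokens.
        aspect_query = aspect_vec.unsqueeze(1)                 # (B, 1, H)
        attended, _ = self.attn(aspect_query, question_tokens, question_tokens)
        # Convolve over the question tokens, then max-pool to one vector.
        conv_out = self.conv(question_tokens.transpose(1, 2))  # (B, H, T)
        pooled = conv_out.max(dim=2).values                    # (B, H)
        # Fuse the CNN view with the aspect-conditioned view of the question.
        return pooled + attended.squeeze(1)


@torch.no_grad()
def embed(text):
    """Return BERT token embeddings of shape (B, T, H) for a piece of text."""
    enc = tokenizer(text, return_tensors="pt")
    return bert(**enc).last_hidden_state


def score_candidate(question, aspect_text, encoder):
    """Cosine similarity between the question and one KB answer aspect."""
    q_tokens = embed(question)
    a_vec = embed(aspect_text).mean(dim=1)  # mean-pool the aspect tokens
    q_vec = encoder(q_tokens, a_vec)
    return torch.cosine_similarity(q_vec, a_vec).item()


encoder = QuestionEncoder()
print(score_candidate("who wrote hamlet?", "william shakespeare author.works", encoder))
```

In the full method each candidate answer would contribute several aspects (entity type, relation path, context), each scored against a dynamically re-represented question; the sketch above shows a single aspect to keep the scoring step visible.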
