Exploring the Combination of Contextual Word Embeddings and Knowledge Graph Embeddings

17 Apr 2020 · Lea Dieudonat, Kelvin Han, Phyllicia Leavitt, Esteban Marquer

"Classical" word embeddings, such as Word2Vec, have been shown to capture the semantics of words based on their distributional properties. However, their ability to represent the different meanings a word may have is limited. Such approaches also do not explicitly encode relations between the entities denoted by words. Embeddings of knowledge bases (KB) capture the explicit relations between entities denoted by words, but are not able to directly capture the syntagmatic properties of these words. To our knowledge, recent research has focused on representation learning approaches that augment the strengths of one with the other. In this work, we begin exploring a different approach that uses contextual and KB embeddings jointly at the same level, and we propose two tasks -- an entity typing task and a relation typing task -- to evaluate the performance of contextual and KB embeddings. We also evaluate a concatenated model of contextual and KB embeddings on these two tasks, and obtain conclusive results on the first. We hope our work can serve as a basis for models and datasets developed in the direction of this approach.
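As an illustration of the concatenated setup described in the abstract, the sketch below combines a contextual embedding of an entity mention (e.g., from BERT) with a pre-trained KB entity embedding (e.g., a TransE-style vector) and feeds the result to a linear entity-typing classifier. This is a minimal sketch, not the authors' implementation: the dimensions, class names, and classifier head are assumptions chosen for illustration.

```python
# Minimal sketch (not the authors' code): concatenate a contextual word
# embedding with a knowledge-base (KB) entity embedding, then classify the
# entity type with a linear layer. All dimensions are illustrative.
import torch
import torch.nn as nn

class ConcatEntityTyper(nn.Module):
    def __init__(self, ctx_dim=768, kb_dim=200, num_types=50):
        super().__init__()
        # classifier over the concatenated joint representation
        self.classifier = nn.Linear(ctx_dim + kb_dim, num_types)

    def forward(self, ctx_emb, kb_emb):
        # ctx_emb: (batch, ctx_dim) contextual embedding of the entity mention
        # kb_emb:  (batch, kb_dim)  pre-trained KB embedding of the entity
        joint = torch.cat([ctx_emb, kb_emb], dim=-1)
        return self.classifier(joint)

# Usage with random tensors standing in for real embeddings.
model = ConcatEntityTyper()
ctx = torch.randn(4, 768)   # e.g. BERT output for the mention span
kb = torch.randn(4, 200)    # e.g. TransE entity vectors
logits = model(ctx, kb)     # (4, num_types) entity-type scores
```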
