1 code implementation • NAACL 2022 • Huda Hakami, Mona Hakami, Angrosh Mandya, Danushka Bollegala
In this paper, we propose and evaluate several methods to address this problem, where we borrow LDPs from the entity pairs that co-occur in sentences in the corpus (i.e. with-mention entity pairs) to represent entity pairs that do not co-occur in any sentence in the corpus (i.e. without-mention entity pairs).
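One plausible borrowing strategy (a minimal sketch, not the paper's exact method) is nearest-neighbour borrowing: a without-mention pair copies the LDPs of the most similar with-mention pair, where similarity is measured between pair embeddings. The function names and data layout here are hypothetical.

```python
import math

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def borrow_ldps(target_vec, with_mention_pairs):
    """Borrow LDPs from the nearest with-mention entity pair.

    with_mention_pairs: list of (pair_embedding, ldps) tuples for pairs
    that do co-occur in some sentence in the corpus.
    """
    best = max(with_mention_pairs, key=lambda p: cosine(target_vec, p[0]))
    return best[1]
```

For example, a without-mention pair whose embedding is close to that of (Obama, Hawaii) would inherit that pair's "born in" dependency paths.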
no code implementations • EACL 2021 • Danushka Bollegala, Huda Hakami, Yuichi Yoshida, Ken-ichi Kawarabayashi
Embedding entities and relations of a knowledge graph in a low-dimensional space has shown impressive performance in predicting missing links between entities.
no code implementations • ICLR 2019 • Danushka Bollegala, Huda Hakami, Yuichi Yoshida, Ken-ichi Kawarabayashi
Existing methods for learning KGEs can be seen as a two-stage process where (a) entities and relations in the knowledge graph are represented using some linear algebraic structures (embeddings), and (b) a scoring function is defined that evaluates the strength of a relation that holds between two entities using the corresponding relation and entity embeddings.
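To make stage (b) concrete, here is one common scoring function, TransE (one illustrative choice; the paper treats scoring functions in general): a triple (h, r, t) is scored by how closely the relation embedding translates the head embedding onto the tail embedding.

```python
import math

def transe_score(h, r, t):
    """TransE-style score: ||h + r - t||, computed over plain lists.

    A lower score means the triple (head, relation, tail) is judged
    more plausible by the embedding model.
    """
    return math.sqrt(sum((hi + ri - ti) ** 2 for hi, ri, ti in zip(h, r, t)))
```

Missing-link prediction then amounts to ranking candidate tails t by this score for a given head and relation.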
no code implementations • AKBC 2019 • Huda Hakami, Danushka Bollegala
We model relation representation as a supervised learning problem and learn parametrised operators that map pre-trained word embeddings to relation representations.
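As a hedged illustration of what a parametrised operator could look like (this specific linear form is an assumption, not necessarily the operator the paper learns): a weight matrix W applied to the concatenation of the two pre-trained word embeddings, with W fitted on labelled relation data.

```python
def linear_relation_op(W, a, b):
    """Map two word embeddings to a relation representation.

    W: learned weight matrix (list of rows), applied to the
    concatenation [a; b] of the two word embedding lists.
    """
    x = a + b  # list concatenation = vector concatenation [a; b]
    return [sum(wij * xj for wij, xj in zip(row, x)) for row in W]
```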
no code implementations • COLING 2018 • Huda Hakami, Kohei Hayashi, Danushka Bollegala
We show that, if the word embeddings are standardised and uncorrelated, such an operator will be independent of bilinear terms, and can be simplified to a linear form, where PairDiff is a special case.
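PairDiff itself is simply the vector difference of the two word embeddings, which is why it arises as a special case of the linear form:

```python
def pair_diff(a, b):
    """PairDiff relation representation: elementwise b - a."""
    return [bi - ai for ai, bi in zip(a, b)]
```

This is the operator behind analogies such as king - man + woman ≈ queen, where the relation vector is the difference of the pair's embeddings.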
no code implementations • 4 Sep 2017 • Huda Hakami, Danushka Bollegala
In contrast, a compositional approach for representing relations between words overcomes these issues by using the attributes of each individual word to indirectly compose a representation for the common relations that hold between the two words.