no code implementations • LREC 2022 • Yixiao Wang, Zied Bouraoui, Luis Espinosa Anke, Steven Schockaert
Many applications crucially rely on the availability of high-quality word vectors.
no code implementations • 25 Mar 2024 • Na Li, Thomas Bailleux, Zied Bouraoui, Steven Schockaert
One line of work treats this task as a Natural Language Inference (NLI) problem, thus relying on the knowledge captured by language models to identify the missing knowledge.
no code implementations • 25 Mar 2024 • Hanane Kteich, Na Li, Usashi Chatterjee, Zied Bouraoui, Steven Schockaert
We show that this leads to embeddings which capture a more diverse range of commonsense properties, and consistently improves results in downstream tasks such as ultra-fine entity typing and ontology completion.
no code implementations • 20 Dec 2023 • Astrid Klipfel, Yaël Frégier, Adlane Sayede, Zied Bouraoui
Discovering crystal structures with specific chemical properties has become an increasingly important focus in material science.
no code implementations • 23 Oct 2023 • Amit Gajbhiye, Zied Bouraoui, Na Li, Usashi Chatterjee, Luis Espinosa Anke, Steven Schockaert
We show that by augmenting the label set with shared properties, we can improve the performance of the state-of-the-art models for this task.
1 code implementation • 7 Jun 2023 • Astrid Klipfel, Yaël Frégier, Adlane Sayede, Zied Bouraoui
One of the greatest challenges facing our society is the discovery of new innovative crystal materials with specific properties.
1 code implementation • 7 Jun 2023 • Astrid Klipfel, Yaël Frégier, Adlane Sayede, Zied Bouraoui
With the aim of training graph-based generative models for new material discovery, we propose an efficient tool that generates cutoff graphs and k-nearest-neighbour graphs of periodic structures with GPU acceleration.
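The neighbour-graph construction described above can be sketched on CPU with NumPy (the paper's tool targets GPUs; the function name and the toy cubic cell are illustrative assumptions, and the minimal-image step shown is the simplified variant valid for near-orthogonal cells):

```python
import numpy as np

def knn_graph_periodic(frac_coords, lattice, k):
    """k-nearest-neighbour graph under periodic boundary conditions.
    Uses the minimal-image convention via rounding of fractional
    differences (a simplification adequate for near-orthogonal cells)."""
    diff = frac_coords[:, None, :] - frac_coords[None, :, :]  # (n, n, 3)
    diff -= np.round(diff)                                    # minimal image
    cart = diff @ lattice                                     # to Cartesian
    dist = np.linalg.norm(cart, axis=-1)
    np.fill_diagonal(dist, np.inf)                            # no self-loops
    return np.argsort(dist, axis=1)[:, :k]                    # neighbour ids

# Toy cubic cell (4 Å edge) with three atoms.
lattice = np.eye(3) * 4.0
frac = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.5, 0.5, 0.5]])
print(knn_graph_periodic(frac, lattice, k=1))
```

The same pairwise-distance tensor also yields a cutoff graph by thresholding `dist` instead of sorting it.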
no code implementations • 22 May 2023 • Na Li, Zied Bouraoui, Steven Schockaert
In this paper, we show that the performance of existing methods can be improved using a simple technique: we use pre-trained label embeddings to cluster the labels into semantic domains and then treat these domains as additional types.
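The label-clustering technique described above can be sketched as follows (a minimal illustration with toy two-dimensional vectors; the label names, the embeddings, and the use of KMeans are assumptions for the sketch, not the paper's exact setup):

```python
import numpy as np
from sklearn.cluster import KMeans

# Toy pre-trained label embeddings (in practice, e.g. vectors of the
# type labels from a static embedding model).
labels = ["doctor", "nurse", "guitar", "violin"]
embeddings = np.array([
    [0.9, 0.1], [0.8, 0.2],   # medical professions
    [0.1, 0.9], [0.2, 0.8],   # musical instruments
])

# Cluster the labels into semantic domains.
domains = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(embeddings)

# Treat each domain as an additional, coarser type for its labels.
augmented = {lab: {lab, f"domain_{d}"} for lab, d in zip(labels, domains)}
print(augmented)
```

A classifier trained on the augmented label set can then share evidence across labels from the same domain.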
1 code implementation • 16 May 2023 • Na Li, Hanane Kteich, Zied Bouraoui, Steven Schockaert
Second, concept embeddings should capture the semantic properties of concepts, whereas contextualised word vectors are also affected by other factors.
1 code implementation • 1 Feb 2023 • Astrid Klipfel, Olivier Peltre, Najwa Harrati, Yaël Frégier, Adlane Sayede, Zied Bouraoui
Automatic material discovery with desired properties is a fundamental challenge for material sciences.
no code implementations • 5 May 2022 • Zied Bouraoui, Sebastien Konieczny, Thanh Ma, Nicolas Schwind, Ivan Varzinczak
This paper introduces a novel method for merging open-domain terminological knowledge.
no code implementations • 2 Dec 2021 • Kun Yan, Chenbin Zhang, Jun Hou, Ping Wang, Zied Bouraoui, Shoaib Jameel, Steven Schockaert
A key feature of the multi-label setting is that images are typically annotated with several labels, which often refer to different regions of the image.
1 code implementation • ACL (RepL4NLP) 2021 • Yixiao Wang, Zied Bouraoui, Luis Espinosa Anke, Steven Schockaert
Second, rather than learning a word vector directly, we use a topic model to partition the contexts in which words appear, and then learn different topic-specific vectors for each word.
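The partitioning step described above can be sketched with a standard topic model (LDA here, on a toy corpus of contexts for the ambiguous word "bank"; the specific topic model, the contexts, and the mean-of-contexts vectors are assumptions for the sketch, not the paper's exact method):

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

# Toy contexts in which the target word "bank" appears.
contexts = [
    "the bank approved the loan and mortgage",
    "deposit money at the bank branch",
    "the river bank was muddy after rain",
    "fishing from the grassy bank of the river",
]

# Partition the contexts with a topic model.
X = CountVectorizer(stop_words="english").fit_transform(contexts)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
topic_of = lda.transform(X).argmax(axis=1)

# One vector per (word, topic): here simply the mean context representation.
vectors = {t: np.asarray(X[topic_of == t].mean(axis=0)).ravel()
           for t in set(topic_of)}
print(topic_of)
```

With real data, each topic-specific vector would instead be learned from the word's occurrences assigned to that topic.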
no code implementations • 21 May 2021 • Kun Yan, Zied Bouraoui, Ping Wang, Shoaib Jameel, Steven Schockaert
While the use of class names has already been explored in previous work, our approach differs in two key aspects.
no code implementations • 1 Feb 2021 • Kun Yan, Zied Bouraoui, Ping Wang, Shoaib Jameel, Steven Schockaert
The aim of few-shot learning (FSL) is to learn how to recognize image categories from a small number of training examples.
no code implementations • 4 Dec 2020 • Na Li, Zied Bouraoui, Jose Camacho-Collados, Luis Espinosa-Anke, Qing Gu, Steven Schockaert
While the success of pre-trained language models has largely eliminated the need for high-quality static word vectors in many NLP applications, such vectors continue to play an important role in tasks where words need to be modelled in the absence of linguistic context.
1 code implementation • COLING 2020 • Rana Alshaikh, Zied Bouraoui, Shelan Jeawak, Steven Schockaert
This is exploited by an associated gating network, which uses pre-trained word vectors to encourage the properties that are modelled by a given embedding to be semantically coherent, i.e. to encourage each of the individual embeddings to capture a meaningful facet.
no code implementations • 13 Dec 2019 • Zied Bouraoui, Antoine Cornuéjols, Thierry Denœux, Sébastien Destercke, Didier Dubois, Romain Guillaume, João Marques-Silva, Jérôme Mengin, Henri Prade, Steven Schockaert, Mathieu Serrurier, Christel Vrain
Some common concerns are identified and discussed, such as the types of representation used, the roles of knowledge and data, the lack or excess of information, and the need for explanations and causal understanding.
no code implementations • 3 Dec 2019 • Zied Bouraoui, Jose Camacho-Collados, Luis Espinosa-Anke, Steven Schockaert
Unfortunately, meaningful regions can be difficult to estimate, especially since we often have few examples of individuals that belong to a given category.
no code implementations • 28 Nov 2019 • Zied Bouraoui, Jose Camacho-Collados, Steven Schockaert
Starting from a few seed instances of a given relation, we first use a large text corpus to find sentences that are likely to express this relation.
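The sentence-gathering step described above can be sketched with a deliberately simple filter (the function name, seed pairs, and toy corpus are assumptions for the sketch; the paper works with a large corpus and a more careful matching procedure):

```python
import re

def sentences_expressing(corpus, seed_pairs):
    """Return sentences likely to express the seed relation: a sentence
    qualifies if it mentions both words of some seed pair."""
    hits = []
    for sentence in corpus:
        tokens = set(re.findall(r"\w+", sentence.lower()))
        if any({a, b} <= tokens for a, b in seed_pairs):
            hits.append(sentence)
    return hits

seeds = [("paris", "france"), ("tokyo", "japan")]
corpus = [
    "Paris is the capital of France.",
    "Tokyo hosted the Olympic Games.",
    "Japan's capital, Tokyo, is on Honshu.",
    "Berlin has many museums.",
]
print(sentences_expressing(corpus, seeds))
```

The retrieved sentences then serve as (noisy) positive evidence for the relation in later steps.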
no code implementations • CONLL 2019 • Rana Alshaikh, Zied Bouraoui, Steven Schockaert
To address this gap, we analyze how, and to what extent, a given vector space embedding can be decomposed into meaningful facets in an unsupervised fashion.
no code implementations • COLING 2018 • Zied Bouraoui, Shoaib Jameel, Steven Schockaert
Given a set of instances of some relation, the relation induction task is to predict which other word pairs are likely to be related in the same way.
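A classic baseline for the relation induction task uses vector offsets, sketched below with toy vectors (the vectors and the offset-distance scoring are assumptions for the sketch, not the paper's model):

```python
import numpy as np

# Toy word vectors; in practice these come from a pre-trained embedding model.
vec = {
    "paris": np.array([1.0, 0.0]), "france": np.array([1.0, 1.0]),
    "tokyo": np.array([0.0, 1.0]), "japan":  np.array([0.0, 2.0]),
    "rome":  np.array([2.0, 0.0]), "italy":  np.array([2.0, 1.0]),
    "big":   np.array([3.0, 0.0]), "small":  np.array([0.0, 3.0]),
}

def offset_score(pair, seed_pairs):
    """Vector-offset baseline: a candidate pair fits the relation when its
    difference vector is close to the mean offset of the seed pairs."""
    mean_offset = np.mean([vec[b] - vec[a] for a, b in seed_pairs], axis=0)
    a, b = pair
    return -np.linalg.norm((vec[b] - vec[a]) - mean_offset)

seeds = [("paris", "france"), ("tokyo", "japan")]
print(offset_score(("rome", "italy"), seeds))  # near 0: likely related
print(offset_score(("big", "small"), seeds))   # strongly negative: unlikely
```

Higher (less negative) scores indicate that the candidate pair is more likely to be related in the same way as the seeds.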
no code implementations • ACL 2018 • Shoaib Jameel, Zied Bouraoui, Steven Schockaert
Word embedding models such as GloVe rely on co-occurrence statistics to learn vector representations of word meaning.
no code implementations • 3 May 2018 • Zied Bouraoui, Steven Schockaert
Several recently proposed methods aim to learn conceptual space representations from large text collections.
no code implementations • 14 Nov 2017 • Shoaib Jameel, Zied Bouraoui, Steven Schockaert
Word embedding models such as GloVe rely on co-occurrence statistics from a large corpus to learn vector representations of word meaning.
no code implementations • 21 Aug 2017 • Zied Bouraoui, Shoaib Jameel, Steven Schockaert
Word embeddings have been found to capture a surprisingly rich amount of syntactic and semantic knowledge.
no code implementations • 18 Feb 2016 • Jean-François Baget, Salem Benferhat, Zied Bouraoui, Madalina Croitoru, Marie-Laure Mugnier, Odile Papini, Swan Rocher, Karim Tabia
We propose a general framework for inconsistency-tolerant query answering in the setting of existential rules.