Neural Class Expression Synthesis

16 Nov 2021 · N'Dah Jean Kouagou, Stefan Heindorf, Caglar Demir, Axel-Cyrille Ngonga Ngomo

Most existing approaches for class expression learning in description logics are search algorithms. As the search space of these approaches is infinite, they often fail to scale to large learning problems. Our main intuition is that class expression learning can be regarded as a translation problem. Based thereupon, we propose a new family of class expression learning approaches, which we dub neural class expression synthesis. Instances of this new family circumvent the high search costs incurred by current algorithms by translating training examples into class expressions in a fashion akin to machine translation systems. Consequently, they are not subject to the runtime limitations of search-based approaches after training. We study three instances of this novel family of approaches that synthesize class expressions from sets of positive and negative examples. An evaluation of our approach on four benchmark datasets suggests that it can effectively synthesize high-quality class expressions with respect to the input examples in approximately one second on average. Moreover, a comparison to other state-of-the-art approaches suggests that we achieve better F-measures on large datasets. For reproducibility purposes, we provide our implementation as well as pretrained models in our public GitHub repository at https://github.com/fosterreproducibleresearch/NCES.
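The abstract frames class expression learning as a translation problem: a set of positive and a set of negative examples goes in, a class expression comes out as a token sequence. The sketch below is one minimal way to realize that framing, assuming pretrained entity embeddings are available for the examples. The class name NCESSketch, the set-encoder/GRU-decoder layout, and the toy vocabulary are illustrative assumptions, not the architecture shipped in the authors' repository; consult the GitHub link above for the actual models.

```python
import torch
import torch.nn as nn

class NCESSketch(nn.Module):
    """Illustrative encoder-decoder (not the paper's architecture):
    summarize the positive and negative example sets, then decode a
    class expression as a sequence of vocabulary tokens."""

    def __init__(self, emb_dim: int, hidden_dim: int, vocab_size: int, max_len: int = 32):
        super().__init__()
        self.max_len = max_len
        # Shared MLP + mean pooling: a simple permutation-invariant set encoder,
        # so the output does not depend on the order of the input examples.
        self.phi = nn.Sequential(
            nn.Linear(emb_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
        )
        self.merge = nn.Linear(2 * hidden_dim, hidden_dim)
        # The decoder emits logits over a fixed class-expression vocabulary.
        self.decoder = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, pos: torch.Tensor, neg: torch.Tensor) -> torch.Tensor:
        # pos: (n_pos, emb_dim), neg: (n_neg, emb_dim) -- entity embeddings.
        h_pos = self.phi(pos).mean(dim=0)          # (hidden,)
        h_neg = self.phi(neg).mean(dim=0)          # (hidden,)
        ctx = torch.tanh(self.merge(torch.cat([h_pos, h_neg])))
        # Feed the joint summary of both sets at every decoding step.
        steps = ctx.expand(1, self.max_len, -1)    # (1, max_len, hidden)
        dec_out, _ = self.decoder(steps)
        return self.out(dec_out.squeeze(0))        # (max_len, vocab) logits
```

A toy usage example, with random tensors standing in for pretrained embeddings and a hypothetical vocabulary over description-logic syntax and knowledge-base names:

```python
vocab = ["<eos>", "Person", "Film", "⊓", "⊔", "¬", "∃", "∀", "hasChild", ".", "(", ")"]
model = NCESSketch(emb_dim=50, hidden_dim=128, vocab_size=len(vocab))
pos = torch.randn(20, 50)  # stand-ins for embeddings of 20 positive examples
neg = torch.randn(20, 50)
logits = model(pos, neg)
print([vocab[i] for i in logits.argmax(dim=-1).tolist()])
```

Because synthesis is a single forward pass rather than a search, the inference cost is fixed by the network size and maximum expression length, which is what lets this family of approaches sidestep the runtime limitations of search-based learners after training.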
