OTIEA: Ontology-enhanced Triple Intrinsic-Correlation for Cross-lingual Entity Alignment

2 May 2023  ·  Zhishuo Zhang, Chengxiang Tan, Xueyan Zhao, Min Yang, Chaoqun Jiang

Cross-lingual and cross-domain knowledge alignment without sufficient external resources is a fundamental and crucial task for fusing irregular data. As the element-wise fusion process that aims to discover equivalent objects across different knowledge graphs (KGs), entity alignment (EA) has attracted great interest from industry and academia in recent years. Most existing EA methods explore the correlation between entities and relations through neighbor nodes, structural information, and external resources. However, the complex intrinsic interactions among triple elements and role information are rarely modeled in these methods, which may lead to inadequate representation of triples. In addition, external resources are often unavailable in some scenarios, especially cross-lingual and cross-domain applications, which limits the scalability of these methods. To address these shortcomings, this paper proposes OTIEA, a novel universal EA framework based on ontology pairs and a role enhancement mechanism via triple-aware attention, without introducing external resources. Specifically, an ontology-enhanced triple encoder is designed to mine intrinsic correlations and ontology-pair information instead of treating triple elements independently. In addition, EA-oriented representations are obtained in a triple-aware entity decoder by fusing role diversity. Finally, a bidirectional iterative alignment strategy is deployed to expand the seed entity pairs. Experimental results on three real-world datasets show that our framework achieves competitive performance compared with baselines.
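The abstract does not detail the bidirectional iterative alignment step. The sketch below shows one common way such a strategy can be realized, assuming entity embeddings `emb_kg1` and `emb_kg2` produced by the decoder, an existing seed set `seeds`, and a hypothetical confidence threshold `tau`; it is an illustration of the general idea, not the authors' implementation. Pairs that are mutual nearest neighbors in both alignment directions and sufficiently similar are added to the seed set, and the process repeats.

```python
# Illustrative sketch (not the paper's code): bidirectional iterative
# expansion of seed entity pairs from two sets of entity embeddings.
# `emb_kg1`, `emb_kg2`, `seeds`, and `tau` are assumed inputs.
import numpy as np

def expand_seeds(emb_kg1, emb_kg2, seeds, tau=0.9, max_rounds=5):
    """Add mutual nearest-neighbor pairs whose cosine similarity exceeds tau."""
    seeds = set(seeds)
    for _ in range(max_rounds):
        # Cosine similarity between every entity in KG1 and every entity in KG2.
        a = emb_kg1 / np.linalg.norm(emb_kg1, axis=1, keepdims=True)
        b = emb_kg2 / np.linalg.norm(emb_kg2, axis=1, keepdims=True)
        sim = a @ b.T
        best_12 = sim.argmax(axis=1)   # nearest neighbor in KG2 for each KG1 entity
        best_21 = sim.argmax(axis=0)   # nearest neighbor in KG1 for each KG2 entity
        new_pairs = {
            (i, j) for i, j in enumerate(best_12)
            if best_21[j] == i          # mutually nearest (bidirectional agreement)
            and sim[i, j] >= tau        # confident enough to accept
            and (i, j) not in seeds
        }
        if not new_pairs:               # converged: no new seed pairs found
            break
        seeds |= new_pairs
        # In the full framework, the encoder/decoder would be retrained here
        # with the enlarged seed set before the next expansion round.
    return seeds
```

Requiring mutual agreement in both directions is what makes the expansion bidirectional; it filters out one-sided matches and keeps the newly added seeds relatively clean as the iteration proceeds.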



