Entity Embeddings
70 papers with code • 0 benchmarks • 2 datasets
Entity Embeddings is a technique for applying deep learning to tabular data. It represents each categorical variable as a learned, dense, multi-dimensional vector, rather than as a sparse one-hot encoding, so that related category values end up close together in the embedding space.
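A minimal sketch of this idea in PyTorch (the model, column sizes, and names are illustrative, not taken from any particular paper): a categorical column is fed through an `nn.Embedding` layer and the resulting dense vector is concatenated with the numeric features.

```python
import torch
import torch.nn as nn

class TabularEmbeddingModel(nn.Module):
    """Toy tabular model with one categorical and several numeric features."""

    def __init__(self, num_categories: int, embedding_dim: int, num_numeric: int):
        super().__init__()
        # Each category index maps to a learned dense vector.
        self.embed = nn.Embedding(num_categories, embedding_dim)
        self.head = nn.Linear(embedding_dim + num_numeric, 1)

    def forward(self, cat_idx: torch.Tensor, numeric: torch.Tensor) -> torch.Tensor:
        # Concatenate the learned category embedding with the numeric features.
        x = torch.cat([self.embed(cat_idx), numeric], dim=-1)
        return self.head(x)

model = TabularEmbeddingModel(num_categories=10, embedding_dim=4, num_numeric=3)
out = model(torch.tensor([2, 7]), torch.randn(2, 3))
print(out.shape)  # torch.Size([2, 1])
```

During training the embedding table is updated by backpropagation like any other weight, which is what makes the learned vectors useful as standalone category representations afterwards.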
Benchmarks
These leaderboards are used to track progress in Entity Embeddings
Most implemented papers
Merge and Label: A novel neural network architecture for nested NER
Named entity recognition (NER) is one of the best studied tasks in natural language processing.
Relation-Aware Entity Alignment for Heterogeneous Knowledge Graphs
Entity alignment is the task of linking entities with the same real-world identity from different knowledge graphs (KGs), which has been recently dominated by embedding-based methods.
Jointly Learning Entity and Relation Representations for Entity Alignment
Entity alignment is a viable means for integrating heterogeneous knowledge among different knowledge graphs (KGs).
Aligning Cross-Lingual Entities with Multi-Aspect Information
Multilingual knowledge graphs (KGs), such as YAGO and DBpedia, represent entities in different languages.
KRED: Knowledge-Aware Document Representation for News Recommendations
News articles usually contain knowledge entities such as celebrities or organizations.
E-BERT: Efficient-Yet-Effective Entity Embeddings for BERT
We present a novel way of injecting factual knowledge about entities into the pretrained BERT model (Devlin et al., 2019): We align Wikipedia2Vec entity vectors (Yamada et al., 2016) with BERT's native wordpiece vector space and use the aligned entity vectors as if they were wordpiece vectors.
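The core alignment step described above can be sketched as fitting a linear map between two embedding spaces. The snippet below is a hedged illustration with synthetic data, not E-BERT's actual pipeline: it solves a least-squares problem for a matrix `W` that carries vectors from a source space (standing in for Wikipedia2Vec entity vectors) into a target space (standing in for BERT's wordpiece space).

```python
import numpy as np

rng = np.random.default_rng(0)
d_src, d_tgt, n_shared = 8, 6, 100

# Vectors for items present in both spaces (synthetic stand-ins).
X = rng.normal(size=(n_shared, d_src))   # source-space vectors
Y = rng.normal(size=(n_shared, d_tgt))   # target-space vectors for the same items

# Least-squares solution for W minimizing ||X W - Y||_F.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

entity_vec = rng.normal(size=(d_src,))
aligned = entity_vec @ W                 # now lives in the target space
print(aligned.shape)  # (6,)
```

Once aligned, such vectors can be consumed by the downstream model exactly as if they were native inputs of the target space.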
KEPLER: A Unified Model for Knowledge Embedding and Pre-trained Language Representation
Pre-trained language representation models (PLMs) struggle to capture factual knowledge from text.
MRAEA: An Efficient and Robust Entity Alignment Approach for Cross-lingual Knowledge Graph
We propose Meta Relation Aware Entity Alignment (MRAEA), a novel approach that directly models cross-lingual entity embeddings by attending over a node's incoming and outgoing neighbors and the meta semantics of its connected relations.
Message Passing Query Embedding
The generality of our method allows it to encode a more diverse set of query types in comparison to previous work.
Contextual Parameter Generation for Knowledge Graph Link Prediction
More specifically, we treat relations as the context in which source entities are processed to produce predictions, by using relation embeddings to generate the parameters of a model operating over source entity embeddings.
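The mechanism described here is a form of hypernetwork: a relation embedding generates the parameters of the transformation applied to the source entity embedding. The following is an illustrative sketch under made-up sizes and names, not the paper's implementation.

```python
import torch
import torch.nn as nn

class CPGLinkPredictor(nn.Module):
    """Toy link predictor with relation-generated parameters (hypothetical)."""

    def __init__(self, num_entities, num_relations, ent_dim, rel_dim):
        super().__init__()
        self.ent = nn.Embedding(num_entities, ent_dim)
        self.rel = nn.Embedding(num_relations, rel_dim)
        # Hypernetwork: maps a relation embedding to a full weight matrix.
        self.param_gen = nn.Linear(rel_dim, ent_dim * ent_dim)
        self.ent_dim = ent_dim

    def forward(self, head_idx, rel_idx):
        h = self.ent(head_idx)                          # (B, ent_dim)
        W = self.param_gen(self.rel(rel_idx))           # (B, ent_dim * ent_dim)
        W = W.view(-1, self.ent_dim, self.ent_dim)      # (B, ent_dim, ent_dim)
        # Transform the head entity with relation-specific parameters,
        # then score against all entity embeddings.
        t = torch.bmm(W, h.unsqueeze(-1)).squeeze(-1)   # (B, ent_dim)
        return t @ self.ent.weight.T                    # (B, num_entities)

model = CPGLinkPredictor(num_entities=50, num_relations=5, ent_dim=8, rel_dim=4)
scores = model(torch.tensor([0, 3]), torch.tensor([1, 2]))
print(scores.shape)  # torch.Size([2, 50])
```

The design choice to generate parameters from the relation, rather than concatenating relation and entity vectors, lets each relation define its own transformation of the entity space.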