Transformers

ERNIE is a transformer-based model consisting of two stacked modules: 1) a textual encoder, which captures basic lexical and syntactic information from the input tokens, and 2) a knowledgeable encoder, which is responsible for integrating extra token-oriented knowledge information into the textual information. The knowledgeable encoder consists of stacked aggregators, designed to encode both tokens and entities and to fuse their heterogeneous features. To pre-train these knowledge-enhanced representations, ERNIE adopts a dedicated pre-training task, the denoising entity auto-encoder (dEA): token-entity alignments are randomly masked, and the model is trained to predict the corresponding entities for the aligned tokens. A hedged sketch of both pieces follows below.

Source: ERNIE: Enhanced Language Representation with Informative Entities
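The fusion step inside each aggregator can be sketched as follows. This is a minimal illustrative PyTorch sketch, not the authors' released implementation; the module name `Aggregator`, the dimensions (768-d token states, 100-d entity embeddings), and the layout of the `alignment` tensor are assumptions made for the example.

```python
import torch
import torch.nn as nn


class Aggregator(nn.Module):
    """One aggregator layer: encodes tokens and entities, then fuses their features."""

    def __init__(self, token_dim=768, entity_dim=100, hidden_dim=768,
                 token_heads=12, entity_heads=4):
        super().__init__()
        # Separate multi-head self-attention blocks over the token and entity sequences.
        self.token_attn = nn.MultiheadAttention(token_dim, token_heads, batch_first=True)
        self.entity_attn = nn.MultiheadAttention(entity_dim, entity_heads, batch_first=True)
        # Information-fusion layer: map both modalities into a shared hidden space ...
        self.token_to_hidden = nn.Linear(token_dim, hidden_dim)
        self.entity_to_hidden = nn.Linear(entity_dim, hidden_dim)
        # ... and project the fused state back out to each modality for the next layer.
        self.hidden_to_token = nn.Linear(hidden_dim, token_dim)
        self.hidden_to_entity = nn.Linear(hidden_dim, entity_dim)
        self.act = nn.GELU()

    def forward(self, tokens, entities, alignment):
        # tokens:    (batch, seq_len, token_dim)  -- output of the textual encoder
        # entities:  (batch, ent_len, entity_dim) -- pretrained entity embeddings
        # alignment: (batch, seq_len, ent_len)    -- 1.0 where token j aligns with entity k
        tokens, _ = self.token_attn(tokens, tokens, tokens)
        entities, _ = self.entity_attn(entities, entities, entities)

        # For every token, gather the entity it aligns with (zero vector if none).
        aligned_entities = alignment @ entities              # (batch, seq_len, entity_dim)

        # Fuse the heterogeneous token and entity features into a joint hidden state.
        hidden = self.act(self.token_to_hidden(tokens)
                          + self.entity_to_hidden(aligned_entities))

        # Produce the next layer's token and entity representations from the fused state.
        new_tokens = self.act(self.hidden_to_token(hidden))
        new_entities = self.act(self.hidden_to_entity(alignment.transpose(1, 2) @ hidden))
        return new_tokens, new_entities
```

Likewise, the alignment-masking step behind the denoising entity auto-encoder can be sketched as below, assuming the same `alignment` tensor; the helper name and the `mask_prob` value are illustrative and not taken from the paper.

```python
def mask_alignments(alignment, mask_prob=0.15):
    # alignment: (batch, seq_len, ent_len); mask_prob is an illustrative value.
    # Drop each token's alignment row with probability mask_prob; during pre-training
    # the model must then recover the masked entity from the aligned token's state.
    keep = (torch.rand(alignment.shape[:2], device=alignment.device) >= mask_prob).float()
    return alignment * keep.unsqueeze(-1)
```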


Tasks


Task | Papers | Share
Language Modelling | 16 | 13.45%
Named Entity Recognition (NER) | 5 | 4.20%
Sentence | 5 | 4.20%
Natural Language Inference | 5 | 4.20%
Sentiment Analysis | 5 | 4.20%
Question Answering | 4 | 3.36%
Text Generation | 3 | 2.52%
NER | 3 | 2.52%
Classification | 3 | 2.52%

