ERNIE-GEN is a multi-flow sequence-to-sequence pre-training and fine-tuning framework that bridges the discrepancy between training and inference with an infilling generation mechanism and a noise-aware generation method. To make generation closer to human writing patterns, the framework introduces a span-by-span generation flow that trains the model to predict semantically complete spans consecutively rather than word by word. Unlike existing pre-training methods, ERNIE-GEN incorporates multi-granularity target sampling to construct pre-training data, which enhances the correlation between the encoder and the decoder.
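To make the multi-granularity target sampling idea concrete, here is a minimal illustrative sketch: non-overlapping target spans of varying lengths are sampled from a token sequence, masked out of the encoder context, and kept as decoder targets. The function name, parameters, and `[MASK]` placeholder are hypothetical and not taken from the ERNIE-GEN codebase.

```python
import random

def sample_target_spans(tokens, span_ratio=0.25, max_span_len=4, seed=0):
    """Illustrative multi-granularity target sampling (hypothetical helper).

    Samples non-overlapping spans of 1..max_span_len tokens until roughly
    span_ratio of the sequence is covered. Returns the masked encoder
    context and the (start, span) pairs used as decoder targets.
    """
    rng = random.Random(seed)
    n = len(tokens)
    budget = max(1, int(n * span_ratio))  # total tokens to mask
    spans, used, attempts = [], set(), 0
    while budget > 0 and attempts < 100:  # retry cap avoids rare livelock
        attempts += 1
        length = rng.randint(1, min(max_span_len, budget))
        start = rng.randint(0, n - length)
        idx = set(range(start, start + length))
        if idx & used:  # reject overlapping spans
            continue
        used |= idx
        spans.append((start, tokens[start:start + length]))
        budget -= length
    context = ["[MASK]" if i in used else t for i, t in enumerate(tokens)]
    return context, sorted(spans)
```

During span-by-span pre-training, the decoder would then be asked to generate each sampled span as a unit, conditioned on the masked context, rather than predicting one token at a time.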
Source: ERNIE-GEN: An Enhanced Multi-Flow Pre-training and Fine-tuning Framework for Natural Language Generation
| Task | Papers | Share |
|---|---|---|
| Abstractive Text Summarization | 1 | 16.67% |
| Dialogue Generation | 1 | 16.67% |
| Generative Question Answering | 1 | 16.67% |
| Question Generation | 1 | 16.67% |
| Text Generation | 1 | 16.67% |
| Text Summarization | 1 | 16.67% |