Text Generation
1503 papers with code • 21 benchmarks • 115 datasets
Text Generation is the task of producing text that is indistinguishable from human-written text. In the literature, this task is more formally known as natural language generation (NLG).
Text generation can be addressed with Markov processes or deep generative models like LSTMs. More recently, some of the most advanced methods for text generation have been Transformer-based models such as BART and GPT, alongside GAN-based approaches. Text generation systems are evaluated either through human ratings or automatic evaluation metrics like METEOR, ROUGE, and BLEU.
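As a concrete illustration of the Markov-process approach mentioned above, here is a minimal bigram-chain generator in plain Python. This is a toy sketch for intuition only; the corpus, function names, and sampling scheme are illustrative assumptions, not a reference implementation.

```python
import random
from collections import defaultdict

def build_bigram_model(text):
    """Map each word to the list of words observed to follow it."""
    words = text.split()
    model = defaultdict(list)
    for current, following in zip(words, words[1:]):
        model[current].append(following)
    return model

def generate(model, start, length=10, seed=0):
    """Walk the chain, sampling one successor word at each step."""
    rng = random.Random(seed)  # fixed seed for a reproducible walk
    out = [start]
    for _ in range(length - 1):
        successors = model.get(out[-1])
        if not successors:  # dead end: this word never had a successor
            break
        out.append(rng.choice(successors))
    return " ".join(out)

# Tiny toy corpus (illustrative only)
corpus = "the cat sat on the mat and the dog sat on the rug"
model = build_bigram_model(corpus)
print(generate(model, "the", length=8))
```

Deep models such as LSTMs and Transformers replace the raw bigram counts with learned conditional distributions over the next token, but the generation loop is conceptually the same.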
(Image credit: Adversarial Ranking for Language Generation)
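The automatic metrics mentioned above score n-gram overlap between a generated candidate and a human reference. A simplified sentence-level BLEU, using clipped n-gram precision and a brevity penalty, can be sketched as follows; real implementations (e.g., NLTK's) additionally handle smoothing and multiple references, so treat this as an assumption-laden illustration:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def modified_precision(candidate, reference, n):
    """Clipped n-gram precision: candidate counts capped by reference counts."""
    cand_counts = Counter(ngrams(candidate, n))
    ref_counts = Counter(ngrams(reference, n))
    clipped = sum(min(c, ref_counts[g]) for g, c in cand_counts.items())
    total = sum(cand_counts.values())
    return clipped / total if total else 0.0

def sentence_bleu(candidate, reference, max_n=2):
    """Geometric mean of n-gram precisions times a brevity penalty."""
    precisions = [modified_precision(candidate, reference, n)
                  for n in range(1, max_n + 1)]
    if min(precisions) == 0:
        return 0.0
    log_mean = sum(math.log(p) for p in precisions) / max_n
    # Brevity penalty discourages overly short candidates
    bp = 1.0 if len(candidate) > len(reference) else \
        math.exp(1 - len(reference) / len(candidate))
    return bp * math.exp(log_mean)

ref = "the cat is on the mat".split()
cand = "the cat sat on the mat".split()
score = sentence_bleu(cand, ref)
print(round(score, 3))  # geometric mean of 5/6 and 3/5 -> 0.707
```

ROUGE and METEOR follow the same overlap-counting idea but weight recall and word matching differently.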
Libraries
Use these libraries to find Text Generation models and implementations.
Subtasks
- Dialogue Generation
- Data-to-Text Generation
- Multi-Document Summarization
- Text Style Transfer
- Story Generation
- Paraphrase Generation
- Spelling Correction
- Table-to-Text Generation
- Headline Generation
- Conditional Text Generation
- Visual Storytelling
- Text Infilling
- Distractor Generation
- Question-Answer Generation
- News Generation
- Story Completion
- Code Documentation Generation
- Concept-To-Text Generation
- Paper Generation
- Hint Generation
- Profile Generation
- Sonnet Generation
- Fact-based Text Editing
- Rules-of-thumb Generation
- Molecular Description Generation
- Natural Language Landmark Navigation Instructions Generation
Latest papers with no code
LLM-Personalize: Aligning LLM Planners with Human Preferences via Reinforced Self-Training for Housekeeping Robots
We introduce LLM-Personalize, a novel framework with an optimization pipeline designed to personalize LLM planners for household robotics.
Navigating the Path of Writing: Outline-guided Text Generation with Large Language Models
Large Language Models (LLMs) have significantly impacted the writing process, enabling collaborative content creation and enhancing productivity.
Context-Enhanced Language Models for Generating Multi-Paper Citations
This research underscores the potential of harnessing LLMs for citation generation, opening a compelling avenue for exploring the intricate connections between scientific documents.
Parameter Efficient Fine Tuning: A Comprehensive Analysis Across Applications
The rise of deep learning has marked significant progress in fields such as computer vision, natural language processing, and medical imaging, primarily through the adaptation of pre-trained models for specific tasks.
Parameter Efficient Diverse Paraphrase Generation Using Sequence-Level Knowledge Distillation
The distilled models demonstrate faster inference times and the ability to generate diverse paraphrases of comparable quality.
Can We Catch the Elephant? The Evolvement of Hallucination Evaluation on Natural Language Generation: A Survey
Hallucination in Natural Language Generation (NLG) is like the elephant in the room, obvious but often overlooked until recent achievements significantly improved the fluency and grammatical accuracy of generated text.
iRAG: An Incremental Retrieval Augmented Generation System for Videos
Using RAG for combined understanding of multimodal data such as text, images, and videos is appealing, but two critical limitations exist: a one-time, upfront capture of all content in large multimodal data as text descriptions entails high processing times, and not all of the information in rich multimodal data is typically captured in those text descriptions.
From $r$ to $Q^*$: Your Language Model is Secretly a Q-Function
Standard RLHF deploys reinforcement learning in a specific token-level MDP, while DPO is derived as a bandit problem in which the whole response of the model is treated as a single arm.
Prompt-Guided Generation of Structured Chest X-Ray Report Using a Pre-trained LLM
Our method introduces a prompt-guided approach to generate structured chest X-ray reports using a pre-trained large language model (LLM).
Related Work and Citation Text Generation: A Survey
To convince readers of the novelty of their research paper, authors must perform a literature review and compose a coherent story that connects and relates prior works to the current work.