AMR-to-Text Generation

15 papers with code • 5 benchmarks • 5 datasets

Abstract Meaning Representation (AMR) is a directed graph of labeled concepts and relations that captures sentence semantics. Its concepts encode propositional meaning, abstracting away from surface lexical properties. AMR is tree-like in structure: it has a single root node and only a few nodes with multiple parents (reentrancies). The goal of AMR-to-Text Generation is to recover the original sentence realization from an AMR. The task can be seen as the reverse of the structured prediction performed in AMR parsing.

Source: AMR-to-Text Generation with Cache Transition Systems
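For readers new to the formalism, the canonical example graph for "The boy wants to go" can be decoded and inspected with the open-source `penman` Python library. The snippet below is a minimal sketch assuming that library is installed; it is illustrative and not taken from the cited paper.

```python
# Minimal sketch using the open-source `penman` library (pip install penman).
# The AMR below is the standard example for "The boy wants to go."
import penman

amr = """
(w / want-01
   :ARG0 (b / boy)
   :ARG1 (g / go-02
            :ARG0 b))
"""

graph = penman.decode(amr)

# Each triple is (source, role, target); the reentrant variable `b`
# appears as the :ARG0 of both want-01 and go-02.
for triple in graph.triples:
    print(triple)

print("root:", graph.top)  # single root node: 'w'
```

The shared variable `b` is a reentrancy: the boy is both the wanter and the goer, which is exactly the non-tree structure the task description refers to.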

Most implemented papers

Investigating Pretrained Language Models for Graph-to-Text Generation

UKPLab/plms-graph2text EMNLP (NLP4ConvAI) 2021

We show that the PLMs BART and T5 achieve new state-of-the-art results and that task-adaptive pretraining strategies improve their performance even further.
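For orientation, the recipe behind these results is to linearize the AMR graph into a string and run ordinary seq2seq generation with a pretrained model. The sketch below uses the Hugging Face `transformers` API with a plain `t5-base` checkpoint as a stand-in; the model name and input format are assumptions, since the UKPLab/plms-graph2text repository ships its own fine-tuned checkpoints and preprocessing.

```python
# Hedged sketch of the generic PLM approach: linearize the AMR and run
# seq2seq generation. `t5-base` is an untuned stand-in, not the paper's
# fine-tuned checkpoint; see UKPLab/plms-graph2text for those.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "t5-base"  # assumption: swap in a graph-to-text fine-tuned model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# A depth-first PENMAN linearization of the AMR for "The boy wants to go."
linearized_amr = "( want-01 :ARG0 ( boy ) :ARG1 ( go-02 :ARG0 boy ) )"

inputs = tokenizer(linearized_amr, return_tensors="pt")
output_ids = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```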

Structural Neural Encoders for AMR-to-text Generation

mdtux89/OpenNMT-py-AMR-to-text NAACL 2019

AMR-to-text generation is a problem recently introduced to the NLP community, in which the goal is to generate sentences from Abstract Meaning Representation (AMR) graphs.

Graph Pre-training for AMR Parsing and Generation

muyeby/amrbart ACL 2022

To our knowledge, we are the first to consider pre-training on semantic graphs.

A Graph-to-Sequence Model for AMR-to-Text Generation

freesunshine0316/neural-graph-to-seq-mp ACL 2018

The problem of AMR-to-text generation is to recover a text representing the same meaning as an input AMR graph.

Modeling Graph Structure in Transformer for Better AMR-to-Text Generation

Amazing-J/structural-transformer IJCNLP 2019

Recent studies on AMR-to-text generation often formalize the task as a sequence-to-sequence (seq2seq) learning problem by converting an Abstract Meaning Representation (AMR) graph into a word sequence.
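The conversion step is a depth-first traversal that flattens concepts and relation labels into one token stream, which is precisely what loses explicit graph structure and motivates the structure-aware Transformer proposed here. The helper below is a hypothetical illustration built on the `penman` library, not the authors' preprocessing.

```python
# Hypothetical illustration of AMR linearization for seq2seq models,
# built on the `penman` library's tree view; not the authors' pipeline.
import penman

def linearize(amr_string):
    """Flatten an AMR into a token sequence via depth-first traversal."""
    tree = penman.parse(amr_string)

    def walk(node):
        var, branches = node
        tokens = ["("]
        for role, target in branches:
            if role == "/":          # concept label of this node
                tokens.append(target)
            else:                    # labeled edge, then subtree or variable
                tokens.append(role)
                if isinstance(target, tuple):
                    tokens.extend(walk(target))
                else:
                    tokens.append(str(target))
        tokens.append(")")
        return tokens

    return " ".join(walk(tree.node))

print(linearize("(w / want-01 :ARG0 (b / boy) :ARG1 (g / go-02 :ARG0 b))"))
# -> ( want-01 :ARG0 ( boy ) :ARG1 ( go-02 :ARG0 b ) )
```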

Enhancing AMR-to-Text Generation with Dual Graph Representations

UKPLab/emnlp2019-dualgraph IJCNLP 2019

Generating text from graph-based data, such as Abstract Meaning Representation (AMR), is a challenging task due to the inherent difficulty of properly encoding the structure of a graph with labeled edges.
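A common baseline trick for the labeled-edge problem, simpler than this paper's dual top-down/bottom-up representations, is the Levi-graph transformation: each labeled edge becomes a node of its own, leaving an unlabeled graph that a standard encoder can process. The function below is a hedged sketch of that idea, not the paper's model; all names in it are illustrative.

```python
# Hedged sketch of the Levi-graph transformation often used to handle
# labeled edges: each edge label becomes a node, yielding an unlabeled
# graph. Illustrates the general idea, not this paper's dual encoding.
def to_levi_graph(triples):
    """Turn (source, role, target) triples into unlabeled edges."""
    nodes, edges = set(), []
    for i, (src, role, tgt) in enumerate(triples):
        role_node = f"{role}#{i}"        # one fresh node per edge occurrence
        nodes.update([src, role_node, tgt])
        edges.append((src, role_node))   # source -> role node
        edges.append((role_node, tgt))   # role node -> target
    return nodes, edges

triples = [("w", ":ARG0", "b"), ("w", ":ARG1", "g"), ("g", ":ARG0", "b")]
nodes, edges = to_levi_graph(triples)
print(sorted(nodes))
print(edges)
```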

Graph Transformer for Graph-to-Sequence Learning

jcyk/gtos 18 Nov 2019

The dominant graph-to-sequence transduction models employ graph neural networks for graph representation learning, where the structural information is reflected by the receptive field of neurons.
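The receptive-field point is easy to see concretely: one message-passing layer mixes only immediate neighbors, so k layers are needed before a node can be influenced by k-hop context, which is the bottleneck the Graph Transformer's global attention removes. Below is a toy NumPy sketch of a single mean-aggregation layer; it is illustrative only, not the paper's architecture.

```python
# Toy NumPy sketch of one GNN message-passing layer: each layer only
# aggregates immediate neighbors, so k layers give a k-hop receptive
# field. Illustrative only; not the paper's model.
import numpy as np

def gnn_layer(A, H, W):
    """A: (n, n) adjacency with self-loops; H: (n, d) node states."""
    deg = A.sum(axis=1, keepdims=True)
    return np.maximum((A / deg) @ H @ W, 0.0)  # mean-aggregate, then ReLU

n, d = 4, 8
A = np.eye(n)
A[0, 1] = A[1, 2] = A[2, 3] = 1   # a simple chain 0 -> 1 -> 2 -> 3
H = np.random.randn(n, d)
W = np.random.randn(d, d)

H = gnn_layer(A, H, W)  # after one layer, node 0 has only seen node 1
```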

Have Your Text and Use It Too! End-to-End Neural Data-to-Text Generation with Semantic Fidelity

amazon-research/datatuner COLING 2020

Our generated text has significantly better semantic fidelity than the state of the art across all four datasets.

GPT-too: A language-model-first approach for AMR-to-text generation

IBM/GPT-too-AMR2text ACL 2020

Abstract Meaning Representations (AMRs) are broad-coverage sentence-level semantic graphs.

Online Back-Parsing for AMR-to-Text Generation

muyeby/AMR-Backparsing EMNLP 2020

AMR-to-text generation aims to recover a text containing the same meaning as an input AMR graph.