Constituency Parsing

73 papers with code • 4 benchmarks • 6 datasets

Constituency parsing aims to extract from a sentence a constituency-based parse tree that represents its syntactic structure according to a phrase structure grammar.

Example:

             Sentence (S)
                 |
   +-------------+------------+
   |                          |
 Noun (N)                Verb Phrase (VP)
   |                          |
 John                 +-------+--------+
                      |                |
                    Verb (V)         Noun (N)
                      |                |
                    sees              Bill

Recent approaches convert the parse tree into a sequence following a depth-first traversal, so that sequence-to-sequence models can be applied to it. The linearized version of the above parse tree (omitting the words) looks as follows: (S (N) (VP (V) (N))).
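The depth-first linearization described above can be sketched as a short recursive function. This is a minimal illustration, not taken from any of the papers below; the `Node` class and its field names are assumptions for the example, and here the words are kept at the leaves of the bracketed sequence.

```python
# Minimal sketch of depth-first linearization of a constituency tree.
# The Node class is illustrative, not from any specific parser.

class Node:
    def __init__(self, label, children=None, word=None):
        self.label = label
        self.children = children or []
        self.word = word  # set only for terminal (leaf) nodes

def linearize(node):
    """Depth-first traversal producing a bracketed sequence."""
    if node.word is not None:
        return f"({node.label} {node.word})"
    inner = " ".join(linearize(child) for child in node.children)
    return f"({node.label} {inner})"

# The example tree from above: "John sees Bill".
tree = Node("S", [
    Node("N", word="John"),
    Node("VP", [
        Node("V", word="sees"),
        Node("N", word="Bill"),
    ]),
])

print(linearize(tree))  # (S (N John) (VP (V sees) (N Bill)))
```

The resulting string is a flat token sequence of brackets, labels, and words, which is exactly the form a sequence-to-sequence decoder can emit token by token.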

Most implemented papers

What Do Recurrent Neural Network Grammars Learn About Syntax?

clab/rnng EACL 2017

We investigate what information recurrent neural network grammars (RNNGs) learn, from a linguistic perspective, through various ablations to the model and the data, and by augmenting the model with an attention mechanism (GA-RNNG) to enable closer inspection.

Span-Based Constituency Parsing with a Structure-Label System and Provably Optimal Dynamic Oracles

jhcross/span-parser EMNLP 2016

Parsing accuracy using efficient greedy transition systems has improved dramatically in recent years thanks to neural networks.

Multilingual Lexicalized Constituency Parsing with Word-Level Auxiliary Tasks

mcoavoux/mtg EACL 2017

We introduce a constituency parser based on a bi-LSTM encoder adapted from recent work (Cross and Huang, 2016b; Kiperwasser and Goldberg, 2016), which can incorporate a lower level character biLSTM (Ballesteros et al., 2015; Plank et al., 2016).

Parsing with Traces: An $O(n^4)$ Algorithm and a Structural Representation

jkkummerfeld/1ec-graph-parser 13 Jul 2017

General treebank analyses are graph structured, but parsers are typically restricted to tree structures for efficiency and modeling reasons.

A Generative Parser with a Discriminative Recognition Algorithm

cheng6076/virnng ACL 2017

Generative models defining joint distributions over parse trees and sentences are useful for parsing and language modeling, but impose restrictions on the scope of features and are often outperformed by discriminative models.

What's Going On in Neural Constituency Parsers? An Analysis

dgaddy/parser-analysis NAACL 2018

A number of differences have emerged between modern and classic approaches to constituency parsing in recent years, with structural components like grammars and feature-rich lexicons becoming less central while recurrent neural network representations rise in popularity.

Gaussian Mixture Latent Vector Grammars

zhaoyanpeng/lveg ACL 2018

We introduce Latent Vector Grammars (LVeGs), a new framework that extends latent variable grammars such that each nonterminal symbol is associated with a continuous vector space representing the set of (infinitely many) subtypes of the nonterminal.

An Empirical Study of Building a Strong Baseline for Constituency Parsing

nttcslab-nlp/strong_s2s_baseline_parser ACL 2018

This paper investigates the construction of a strong baseline based on general purpose sequence-to-sequence models for constituency parsing.