Constituency Parsing
74 papers with code • 4 benchmarks • 6 datasets
Constituency parsing aims to extract a constituency-based parse tree from a sentence, representing the sentence's syntactic structure according to a phrase structure grammar.
Example:
             Sentence (S)
                  |
       +----------+----------+
       |                     |
   Noun (N)          Verb Phrase (VP)
       |                     |
     John           +--------+--------+
                    |                 |
                Verb (V)          Noun (N)
                    |                 |
                  sees              Bill
Recent approaches convert the parse tree into a sequence following a depth-first traversal, so that sequence-to-sequence models can be applied to it. The linearized version of the above parse tree looks as follows: (S (N John) (VP (V sees) (N Bill))).
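The depth-first linearization can be sketched in a few lines of plain Python; the tuple encoding of the tree and the `linearize` helper are illustrative assumptions, not any particular parser's API.

```python
# Minimal sketch of depth-first tree linearization (plain Python).
# A tree is encoded as (label, child, child, ...), where each child is
# either another tree tuple or a terminal word string -- an assumed
# encoding chosen for illustration.

def linearize(tree):
    """Produce a bracketed string via depth-first traversal."""
    label, *children = tree
    parts = [linearize(c) if isinstance(c, tuple) else c for c in children]
    return f"({label} {' '.join(parts)})"

# The example tree for "John sees Bill":
tree = ("S", ("N", "John"), ("VP", ("V", "sees"), ("N", "Bill")))
print(linearize(tree))  # (S (N John) (VP (V sees) (N Bill)))
```

Once a tree is flattened this way, parsing can be cast as string-to-string transduction: the model reads the sentence and emits the bracketed sequence token by token.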
Most implemented papers
Grammar Induction with Neural Language Models: An Unusual Replication
A substantial thread of recent work on latent tree learning has attempted to develop neural network models with parse-valued latent variables and train them on non-parsing tasks, in the hope of having them discover interpretable tree structure.
Direct Output Connection for a High-Rank Language Model
This paper proposes a state-of-the-art recurrent neural network (RNN) language model that combines probability distributions computed not only from a final RNN layer but also from middle layers.
Unlexicalized Transition-based Discontinuous Constituency Parsing
Lexicalized parsing models are based on the assumptions that (i) constituents are organized around a lexical head, and (ii) bilexical statistics are crucial to resolving ambiguities.
Discontinuous Constituency Parsing with a Stack-Free Transition System and a Dynamic Oracle
We introduce a novel transition system for discontinuous constituency parsing.
Unsupervised Latent Tree Induction with Deep Inside-Outside Recursive Auto-Encoders
We introduce the deep inside-outside recursive autoencoder (DIORA), a fully-unsupervised method for discovering syntax that simultaneously learns representations for constituents within the induced tree.
PTB Graph Parsing with Tree Approximation
The Penn Treebank (PTB) represents syntactic structures as graphs due to nonlocal dependencies.
Sequence Labeling Parsing by Learning Across Representations
We use parsing as sequence labeling as a common framework to learn across constituency and dependency syntactic abstractions.
Head-Driven Phrase Structure Grammar Parsing on Penn Treebank
In detail, we report 96.33 F1 for constituent parsing and 97.20% UAS for dependency parsing on PTB.
Cross-Domain Generalization of Neural Constituency Parsers
Neural parsers obtain state-of-the-art results on benchmark treebanks for constituency parsing -- but to what degree do they generalize to other domains?
Unsupervised Discourse Constituency Parsing Using Viterbi EM
In this paper, we introduce an unsupervised discourse constituency parsing algorithm.