Constituency Parsing
73 papers with code • 4 benchmarks • 6 datasets
Constituency parsing aims to extract from a sentence a constituency-based parse tree that represents its syntactic structure according to a phrase structure grammar.
Example:
             Sentence (S)
                  |
    +-------------+------------+
    |                          |
Noun (N)               Verb Phrase (VP)
    |                          |
  John                 +-------+--------+
                       |                |
                   Verb (V)         Noun (N)
                       |                |
                     sees             Bill
Recent approaches convert the parse tree into a sequence following a depth-first traversal, so that sequence-to-sequence models can be applied to it. The linearized version of the above parse tree is: (S (N John) (VP (V sees) (N Bill))).
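The depth-first linearization can be sketched in plain Python (a minimal illustration, not code from any of the papers below; the nested-tuple tree encoding is an assumption made for this sketch). It produces a fully bracketed variant that keeps the terminal words:

```python
def linearize(tree):
    """Depth-first traversal of a (label, children) tuple tree
    into a bracketed string."""
    label, children = tree
    if not children:  # leaf: a terminal word
        return label
    inner = " ".join(linearize(child) for child in children)
    return f"({label} {inner})"

# The example parse tree for "John sees Bill":
tree = ("S", [
    ("N", [("John", [])]),
    ("VP", [
        ("V", [("sees", [])]),
        ("N", [("Bill", [])]),
    ]),
])

print(linearize(tree))  # (S (N John) (VP (V sees) (N Bill)))
```

A sequence-to-sequence parser is then trained to emit such bracketed strings token by token, with the input sentence as the source sequence.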
Latest papers with no code
Constituents Correspond to Word Sequence Patterns among Sentences with Equivalent Predicate-Argument Structures: Unsupervised Constituency Parsing by Span Matching
In this study, we empirically verify that constituents correspond to word sequence patterns in the PAS-equivalent sentence set.
Targeted aspect-based emotion analysis to detect opportunities and precaution in financial Twitter messages
To the best of our knowledge, no prior work in the literature has addressed this problem despite its practical interest in decision-making, and we are not aware of any previous NLP or online Machine Learning approaches to TABEA.
Ensemble-Based Unsupervised Discontinuous Constituency Parsing by Tree Averaging
We address unsupervised discontinuous constituency parsing, where we observe a high variance in the performance of the only previous model.
Sketch-Guided Constrained Decoding for Boosting Blackbox Large Language Models without Logit Access
This paper introduces sketch-guided constrained decoding (SGCD), a novel approach to constrained decoding for blackbox LLMs, which operates without access to the logits of the blackbox LLM.
Multistage Collaborative Knowledge Distillation from a Large Language Model for Semi-Supervised Sequence Generation
In this paper, we present the discovery that a student model distilled from a few-shot prompted LLM can commonly generalize better than its teacher to unseen examples on such tasks.
Constituency Parsing using LLMs
Constituency parsing is a fundamental yet unsolved natural language processing task.
DiffCloth: Diffusion Based Garment Synthesis and Manipulation via Structural Cross-modal Semantic Alignment
Cross-modal garment synthesis and manipulation will significantly benefit the way fashion designers generate garments and modify their designs via flexible linguistic interfaces. Current approaches follow the general text-to-image paradigm and mine cross-modal relations via simple cross-attention modules, neglecting the structural correspondence between visual and textual representations in the fashion design domain.
Cross-Lingual Constituency Parsing for Middle High German: A Delexicalized Approach
However, training an automatic syntactic analysis system for ancient languages solely relying on annotated parse data is a formidable task due to the inherent challenges in building treebanks for such languages.
Do Transformers Parse while Predicting the Masked Word?
We also show that the Inside-Outside algorithm is optimal for masked language modeling loss on the PCFG-generated data.
Re-evaluating the Need for Multimodal Signals in Unsupervised Grammar Induction
Are multimodal inputs necessary for grammar induction?