Dependency Parsing
322 papers with code • 15 benchmarks • 14 datasets
Dependency parsing is the task of extracting a dependency parse of a sentence, representing its grammatical structure as a set of relationships between "head" words and the words that modify those heads.
Example:
For the sentence "I prefer the morning flight through Denver", the parse consists of the following directed, labeled arcs from heads to dependents:
root(ROOT → prefer)
nsubj(prefer → I)
dobj(prefer → flight)
det(flight → the)
nmod(flight → morning)
nmod(flight → Denver)
case(Denver → through)
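As a minimal sketch (all names here are illustrative, not from any particular library), the head/dependent relations of this example can be encoded as plain tuples, and a parser's output scored against them with the standard attachment-score metrics: UAS (fraction of tokens with the correct head) and LAS (correct head and correct relation label).

```python
# Each token is (form, head_index, relation); head index 0 denotes the ROOT.
# Gold parse of "I prefer the morning flight through Denver".
gold = [
    ("I",       2, "nsubj"),
    ("prefer",  0, "root"),
    ("the",     5, "det"),
    ("morning", 5, "nmod"),
    ("flight",  2, "dobj"),
    ("through", 7, "case"),
    ("Denver",  5, "nmod"),
]

def attachment_scores(gold, pred):
    """UAS: fraction of tokens with the correct head.
    LAS: fraction with the correct head AND the correct relation label."""
    assert len(gold) == len(pred)
    uas = sum(g[1] == p[1] for g, p in zip(gold, pred)) / len(gold)
    las = sum(g[1:] == p[1:] for g, p in zip(gold, pred)) / len(gold)
    return uas, las

# A hypothetical parser output with two head errors and one label error:
pred = list(gold)
pred[3] = ("morning", 5, "amod")  # right head, wrong label
pred[5] = ("through", 2, "case")  # wrong head
pred[6] = ("Denver",  2, "nmod")  # wrong head
uas, las = attachment_scores(gold, pred)
print(f"UAS={uas:.3f} LAS={las:.3f}")  # UAS=0.714 LAS=0.571
```

LAS is always at most UAS, since a labeled match additionally requires the head to be correct; benchmark leaderboards for this task typically report both.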
Libraries
Use these libraries to find Dependency Parsing models and implementations.
Datasets
Subtasks
Latest papers with no code
A Morphology-Based Investigation of Positional Encodings
How does the importance of positional encoding in pre-trained language models (PLMs) vary across languages with different morphological complexity?
Empirical Analysis for Unsupervised Universal Dependency Parse Tree Aggregation
Dependency parsing is an essential task in NLP, and the quality of dependency parsers is crucial for many downstream tasks.
MRL Parsing Without Tears: The Case of Hebrew
Syntactic parsing remains a critical tool for relation extraction and information extraction, especially in resource-scarce languages where LLMs fall short.
Hybrid Human-LLM Corpus Construction and LLM Evaluation for Rare Linguistic Phenomena
Argument Structure Constructions (ASCs) are one of the most well-studied construction groups, providing a unique opportunity to demonstrate the usefulness of Construction Grammar (CxG).
NLPre: a revised approach towards language-centric benchmarking of Natural Language Preprocessing systems
Aware of the shortcomings of existing NLPre evaluation approaches, we investigate a novel method of reliable and fair evaluation and performance reporting.
Cross-lingual Transfer Learning for Javanese Dependency Parsing
While TL only uses a source language to pre-train the model, the HTL method uses a source language and an intermediate language in the learning process.
From Dialogue to Diagram: Task and Relationship Extraction from Natural Language for Accelerated Business Process Prototyping
The automatic transformation of verbose natural language descriptions into structured process models remains a highly complex challenge. This paper introduces a contemporary solution; central to our approach is the use of dependency parsing and Named Entity Recognition (NER) to extract key elements from textual descriptions.
Augmenty: A Python Library for Structured Text Augmentation
Augmenty is a Python library for structured text augmentation.
Syntax-Guided Transformers: Elevating Compositional Generalization and Grounding in Multimodal Environments
Compositional generalization, the ability of intelligent models to extrapolate understanding of components to novel compositions, is a fundamental yet challenging facet in AI research, especially within multimodal environments.
ChatGPT is a Potential Zero-Shot Dependency Parser
Pre-trained language models have been widely used in dependency parsing task and have achieved significant improvements in parser performance.