Dependency Parsing

322 papers with code • 14 benchmarks • 14 datasets

Dependency parsing is the task of extracting a dependency parse of a sentence that represents its grammatical structure and defines the relationships between "head" words and the words that modify those heads.

Example:

     root
      |
      | +-------dobj---------+
      | |                    |
nsubj | |   +------det-----+ | +-----nmod------+
+--+  | |   |              | | |               |
|  |  | |   |      +-nmod-+| | |      +-case-+ |
+  |  + |   +      +      || + |      +      | +
I  prefer  the  morning   flight  through  Denver

Relations among the words are illustrated above the sentence with directed, labeled arcs from heads to dependents (+ indicates the dependent).
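A parse like the one above can be encoded compactly as, for each word, the index of its head plus the label of the connecting arc. The following sketch uses a plain head-index list purely as an illustration (this encoding choice, like the variable names, is an assumption, not a standard API):

```python
# Each token is paired with the index of its head (-1 for the root)
# and the label of the arc connecting it to that head.
sentence = ["I", "prefer", "the", "morning", "flight", "through", "Denver"]
heads    = [1, -1, 4, 4, 1, 6, 4]           # head index per token
labels   = ["nsubj", "root", "det", "nmod", "dobj", "case", "nmod"]

# List every labeled arc as (head word, label, dependent word).
arcs = [
    ("ROOT" if h == -1 else sentence[h], lab, sentence[i])
    for i, (h, lab) in enumerate(zip(heads, labels))
]

for head, label, dep in arcs:
    print(f"{label}({head} -> {dep})")
```

Running this prints each arc in the diagram, e.g. `nsubj(prefer -> I)` and `case(Denver -> through)`. The same head/label pair per token is what the CoNLL-U format used by most treebanks records.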

Latest papers with no code

Extracting Relational Triples Based on Graph Recursive Neural Network via Dynamic Feedback Forest Algorithm

no code yet • 22 Aug 2023

Extracting relational triples (subject, predicate, object) from text enables the transformation of unstructured text data into structured knowledge.
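As a loose illustration of how a dependency parse can feed triple extraction (a toy heuristic for intuition only, not the paper's graph recursive network), a (subject, predicate, object) triple can be read off an nsubj arc and a dobj arc that share the same verbal head:

```python
def extract_triples(tokens, heads, labels):
    """Toy heuristic: pair an nsubj and a dobj that share the same head
    into a (subject, predicate, object) triple."""
    subjects = {h: tokens[i] for i, (h, l) in enumerate(zip(heads, labels)) if l == "nsubj"}
    objects  = {h: tokens[i] for i, (h, l) in enumerate(zip(heads, labels)) if l == "dobj"}
    return [(subjects[h], tokens[h], objects[h]) for h in subjects if h in objects]

# The example sentence from the diagram above, with per-token head indices.
tokens = ["I", "prefer", "the", "morning", "flight", "through", "Denver"]
heads  = [1, -1, 4, 4, 1, 6, 4]
labels = ["nsubj", "root", "det", "nmod", "dobj", "case", "nmod"]
print(extract_triples(tokens, heads, labels))  # [('I', 'prefer', 'flight')]
```

Real systems, including the one in this paper, go well beyond such pattern matching, but the sketch shows why syntactic structure is a natural starting point for triple extraction.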

Linguistically-Informed Neural Architectures for Lexical, Syntactic and Semantic Tasks in Sanskrit

no code yet • 17 Aug 2023

We identify four fundamental tasks, which are crucial for developing a robust NLP technology for Sanskrit: word segmentation, dependency parsing, compound type identification, and poetry analysis.

Enriching the NArabizi Treebank: A Multifaceted Approach to Supporting an Under-Resourced Language

no code yet • 26 Jun 2023

In this paper we address the scarcity of annotated data for NArabizi, a Romanized form of North African Arabic used mostly on social media, which poses challenges for Natural Language Processing (NLP).

A Semi-Autoregressive Graph Generative Model for Dependency Graph Parsing

no code yet • 21 Jun 2023

The latter assumes these components to be independent, so that they can be output in a one-shot manner.

Transferring Neural Potentials For High Order Dependency Parsing

no code yet • 18 Jun 2023

The proposed algorithm propagates biaffine neural scores to the graphical model and, by leveraging dual decomposition inference, trains the overall circuit end-to-end to transfer first-order information to high-order information.

Pushing the Limits of ChatGPT on NLP Tasks

no code yet • 16 Jun 2023

In this work, we propose a collection of general modules to address these issues, in an attempt to push the limits of ChatGPT on NLP tasks.

Weakly Supervised Visual Question Answer Generation

no code yet • 11 Jun 2023

To address this issue, we propose a weakly-supervised visual question answer generation method that generates relevant question-answer pairs for a given input image and associated caption.

Data-Efficient French Language Modeling with CamemBERTa

no code yet • 2 Jun 2023

In this paper, we introduce CamemBERTa, a French DeBERTa model that builds upon the DeBERTaV3 architecture and training objective.

Linear-Time Modeling of Linguistic Structure: An Order-Theoretic Perspective

no code yet • 24 May 2023

We show that these exhaustive comparisons can be avoided, and, moreover, the complexity of such tasks can be reduced to linear by casting the relation between tokens as a partial order over the string.

Structured Sentiment Analysis as Transition-based Dependency Parsing

no code yet • 9 May 2023

Structured sentiment analysis (SSA) aims to automatically extract people's opinions from a text in natural language and adequately represent that information in a graph structure.
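Transition-based dependency parsing, referenced in the title above, builds a tree incrementally with a stack, a buffer, and shift/reduce actions. Below is a minimal arc-standard sketch that replays a fixed action sequence (a generic illustration of the parsing paradigm, not the paper's SSA system; the simplified three-word sentence is an assumption):

```python
def parse(words, actions):
    """Arc-standard transition parser: replay a given action sequence.
    Returns arcs as (head index, dependent index) pairs."""
    stack, buffer, arcs = [], list(range(len(words))), []
    for act in actions:
        if act == "SHIFT":              # move next buffer token onto the stack
            stack.append(buffer.pop(0))
        elif act == "LEFT":             # top of stack heads the item below it
            dep = stack.pop(-2)
            arcs.append((stack[-1], dep))
        elif act == "RIGHT":            # item below heads the top of stack
            dep = stack.pop()
            arcs.append((stack[-1], dep))
    return arcs

words = ["I", "prefer", "flight"]       # simplified sentence
actions = ["SHIFT", "SHIFT", "LEFT",    # nsubj: prefer -> I
           "SHIFT", "RIGHT"]            # dobj:  prefer -> flight
print(parse(words, actions))            # -> [(1, 0), (1, 2)]
```

In a trained parser the action at each step is predicted by a classifier over the current stack/buffer configuration rather than supplied in advance; SSA-as-parsing reuses this machinery to build sentiment graphs instead of syntax trees.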