Dependency Parsing

323 papers with code • 14 benchmarks • 14 datasets

Dependency parsing is the task of extracting a dependency parse of a sentence, which represents its grammatical structure and defines the relationships between "head" words and the words that modify those heads.

Example:

I prefer the morning flight through Denver

    root  (ROOT, prefer)
    nsubj (prefer, I)
    dobj  (prefer, flight)
    det   (flight, the)
    nmod  (flight, morning)
    nmod  (flight, Denver)
    case  (Denver, through)

Each relation is a directed, labeled arc from a head word to its dependent, written here as label(head, dependent); the root relation marks the main predicate of the sentence.
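Off-the-shelf parsers expose this structure directly. The snippet below is a minimal sketch using spaCy and its small English model (en_core_web_sm, installed separately); note that spaCy's English models use their own label scheme, so some labels differ from the ones listed above:

    # Minimal sketch: dependency-parse the example sentence with spaCy.
    # Assumes: pip install spacy && python -m spacy download en_core_web_sm
    import spacy

    nlp = spacy.load("en_core_web_sm")
    doc = nlp("I prefer the morning flight through Denver")

    for token in doc:
        # token.dep_ is the relation label; token.head is the governing word
        print(f"{token.dep_}({token.head.text}, {token.text})")

Each printed line corresponds to one arc of the parse; spaCy marks the root by attaching the main verb to itself, so it appears as ROOT(prefer, prefer).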

Latest papers with no code

Linear-Time Modeling of Linguistic Structure: An Order-Theoretic Perspective

no code yet • 24 May 2023

We show that these exhaustive comparisons can be avoided, and, moreover, the complexity of such tasks can be reduced to linear by casting the relation between tokens as a partial order over the string.

Structured Sentiment Analysis as Transition-based Dependency Parsing

no code yet • 9 May 2023

Structured sentiment analysis (SSA) aims to automatically extract people's opinions from a text in natural language and adequately represent that information in a graph structure.
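As general background (independent of this paper's system), a transition-based dependency parser builds the tree incrementally using a stack, a buffer of remaining words, and a small set of actions chosen by an oracle. The sketch below shows only the arc-standard transitions, with a hypothetical scripted oracle standing in for a learned classifier:

    # Minimal arc-standard sketch (illustrative only, not the SSA parser above).
    # Token indices are 1-based; 0 is an artificial ROOT node.
    def parse(words, oracle):
        stack, buffer, arcs = [0], list(range(1, len(words) + 1)), []
        while buffer or len(stack) > 1:
            action, label = oracle(stack, buffer, arcs)
            if action == "SHIFT":            # push the next input token
                stack.append(buffer.pop(0))
            elif action == "LEFT":           # top of stack heads second-from-top
                dependent = stack.pop(-2)
                arcs.append((stack[-1], label, dependent))
            else:                            # "RIGHT": second-from-top heads top
                dependent = stack.pop()
                arcs.append((stack[-1], label, dependent))
        return arcs

    # Tiny scripted run for "I left": expect nsubj(left, I) and root(ROOT, left).
    script = iter([("SHIFT", None), ("SHIFT", None), ("LEFT", "nsubj"), ("RIGHT", "root")])
    print(parse(["I", "left"], lambda stack, buffer, arcs: next(script)))
    # -> [(2, 'nsubj', 1), (0, 'root', 2)]

In a trained parser the scripted oracle is replaced by a classifier over the parser state; how sentiment graphs are mapped onto such transitions is the paper's contribution and is not reproduced here.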

DORIC : Domain Robust Fine-Tuning for Open Intent Clustering through Dependency Parsing

no code yet • 17 Mar 2023

We present our work on Track 2 in the Dialog System Technology Challenges 11 (DSTC11).

Dual-Attention Model for Aspect-Level Sentiment Classification

no code yet • 14 Mar 2023

I propose a novel dual-attention model (DAM) for aspect-level sentiment classification.

Focusing On Targets For Improving Weakly Supervised Visual Grounding

no code yet • 22 Feb 2023

Weakly supervised visual grounding aims to predict the region in an image that corresponds to a specific linguistic query, where the mapping between the target object and query is unknown in the training stage.

Syntactic Structure Processing in the Brain while Listening

no code yet • 16 Feb 2023

In this study, we investigate the predictive power of the brain encoding models in three settings: (i) individual performance of the constituency and dependency syntactic parsing based embedding methods, (ii) efficacy of these syntactic parsing based embedding methods when controlling for basic syntactic signals, (iii) relative effectiveness of each of the syntactic embedding methods when controlling for the other.

Zero-shot cross-lingual transfer language selection using linguistic similarity

no code yet • 31 Jan 2023

This allows us to select a more suitable transfer language which can be used to better leverage knowledge from high-resource languages in order to improve the performance of language applications lacking data.

Weakly Supervised Headline Dependency Parsing

no code yet • 25 Jan 2023

English news headlines form a register with unique syntactic properties that have been documented in linguistics literature since the 1930s.

SGRAM: Improving Scene Graph Parsing via Abstract Meaning Representation

no code yet • 17 Oct 2022

To this end, we design a simple yet effective two-stage scene graph parsing framework utilizing abstract meaning representation, SGRAM (Scene GRaph parsing via Abstract Meaning representation): 1) transforming a textual description of an image into an AMR graph (Text-to-AMR) and 2) encoding the AMR graph into a Transformer-based language model to generate a scene graph (AMR-to-SG).

ATP: A holistic attention integrated approach to enhance ABSA

no code yet • 4 Aug 2022

In the case of ABSA, aspect position plays a vital role.