Cross-lingual zero-shot dependency parsing

3 papers with code • 1 benchmark • 1 dataset

Cross-lingual zero-shot parsing is the task of inferring the dependency parse of sentences in a target language without any labeled training trees for that language; instead, a parser trained on treebanks from other (source) languages is applied directly.
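
To make the setup concrete, here is a minimal sketch of the evaluation protocol, using toy, hypothetical data and a deliberately trivial baseline rather than any paper's method: gold target-language trees are never seen during training and are used only to compute an unlabeled attachment score (UAS).

```python
# Hypothetical toy example of the zero-shot protocol: the parser is built
# without target-language trees; gold target trees are used only for scoring.

from typing import List

def uas(pred_heads: List[int], gold_heads: List[int]) -> float:
    """Unlabeled attachment score: fraction of tokens assigned the correct head."""
    return sum(p == g for p, g in zip(pred_heads, gold_heads)) / len(gold_heads)

def left_branching_parse(n_tokens: int) -> List[int]:
    """Trivial baseline: attach each token to its left neighbour (root first)."""
    return [0] + list(range(1, n_tokens))

# Gold heads are 1-indexed, 0 marks the root; this target-language tree was
# never available at training time.
target_gold_heads = [2, 0, 2, 3]

pred = left_branching_parse(len(target_gold_heads))
print(f"zero-shot UAS: {uas(pred, target_gold_heads):.2f}")
```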

Description from NLP Progress

Most implemented papers

Cross-Lingual Alignment of Contextual Word Embeddings, with Applications to Zero-shot Dependency Parsing

TalSchuster/CrossLingualELMo NAACL 2019

We introduce a novel method for multilingual transfer that utilizes deep contextual embeddings, pretrained in an unsupervised fashion.
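
The core idea is that contextual embedding spaces of different languages can be aligned with a linear map. Below is a hedged numpy sketch, not the authors' code: per-word-type "anchor" vectors from two languages (simulated here with synthetic data) are aligned by solving the orthogonal Procrustes problem, after which mapped target-language embeddings can be fed to a parser trained only on the source language.

```python
# Sketch of anchor-based alignment: learn an orthogonal map W so that
# target-language anchors land on their source-language counterparts.
# Data below is synthetic; real anchors would average contextual vectors
# per word type over a corpus.

import numpy as np

rng = np.random.default_rng(0)

def procrustes(tgt: np.ndarray, src: np.ndarray) -> np.ndarray:
    """Orthogonal W minimizing ||tgt @ W - src||_F, via the SVD closed form."""
    u, _, vt = np.linalg.svd(tgt.T @ src)
    return u @ vt

d = 64
en_anchors = rng.normal(size=(200, d))             # source-language anchors
R, _ = np.linalg.qr(rng.normal(size=(d, d)))       # hidden "true" rotation
de_anchors = en_anchors @ R + 0.01 * rng.normal(size=(200, d))

W = procrustes(de_anchors, en_anchors)
err = np.linalg.norm(de_anchors @ W - en_anchors) / np.linalg.norm(en_anchors)
print(f"relative alignment error: {err:.3f}")      # small -> spaces aligned
```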

Many Languages, One Parser

clab/language-universal-parser TACL 2016

We train one multilingual model for dependency parsing and use it to parse sentences in several languages.
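
A minimal PyTorch sketch of the shared-parser idea follows; it is an assumed simplification, not the paper's actual architecture. Treebanks are pooled across languages, and each token representation is augmented with a learned language embedding so a single set of parameters serves every language.

```python
# Assumed simplification of a language-universal parser's input layer:
# one shared model, conditioned on the input language via an embedding.

import torch
import torch.nn as nn

class MultilingualEncoder(nn.Module):
    def __init__(self, vocab_size: int, n_langs: int, dim: int = 128):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, dim)
        self.lang_emb = nn.Embedding(n_langs, dim)   # one vector per language

    def forward(self, word_ids: torch.Tensor, lang_id: torch.Tensor) -> torch.Tensor:
        # Broadcast the sentence's language vector over all of its tokens.
        return self.word_emb(word_ids) + self.lang_emb(lang_id).unsqueeze(1)

enc = MultilingualEncoder(vocab_size=10_000, n_langs=7)
words = torch.randint(0, 10_000, (1, 5))   # one 5-token sentence
lang = torch.tensor([3])                   # hypothetical index of "de"
reps = enc(words, lang)                    # (1, 5, 128) token representations
# In a full parser these shared representations would feed an arc scorer.
```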

On the Relation between Syntactic Divergence and Zero-Shot Performance

ofirarviv/improving-ud EMNLP 2021

We explore the link between the extent to which syntactic relations are preserved in translation and the ease of correctly constructing a parse tree in a zero-shot setting.