Cross-Lingual NER

23 papers with code • 28 benchmarks • 9 datasets

Cross-lingual NER transfers named entity recognition from a source language with labeled data (typically English) to target languages with little or no labeled data. Approaches fall broadly into model transfer (applying a multilingual model trained on source-language data) and data transfer (translating and label-projecting training data), often combined with knowledge distillation or consistency training on unlabeled target-language text.

Most implemented papers

Single-/Multi-Source Cross-Lingual NER via Teacher-Student Learning on Unlabeled Data in Target Language

microsoft/vert-papers ACL 2020

However, such methods are either not applicable when labeled data in the source languages is unavailable, or do not leverage the information contained in unlabeled data in the target language.
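The core of such teacher-student approaches is distilling the source-trained teacher's soft label distributions into a student on unlabeled target-language text. A minimal sketch of the soft-label distillation loss, with illustrative function names not taken from the paper's code:

```python
import math

def soft_label_distillation_loss(teacher_probs, student_probs, eps=1e-12):
    """Cross-entropy between the teacher's soft label distributions and the
    student's predictions, averaged over the tokens of a sentence.

    teacher_probs, student_probs: lists of per-token probability
    distributions over the NER tag set (e.g. B-PER, I-PER, O, ...).
    """
    total = 0.0
    for t_dist, s_dist in zip(teacher_probs, student_probs):
        # -sum_c p_teacher(c) * log p_student(c) for each token
        total += -sum(t * math.log(s + eps) for t, s in zip(t_dist, s_dist))
    return total / len(teacher_probs)

# Toy distributions over a 3-tag set; the loss is minimized (down to the
# teacher's own entropy) when the student matches the teacher exactly.
teacher = [[0.7, 0.2, 0.1], [0.1, 0.8, 0.1]]
student = [[0.6, 0.3, 0.1], [0.1, 0.7, 0.2]]
loss = soft_label_distillation_loss(teacher, student)
```

Because the loss uses the teacher's soft distributions rather than hard labels, the student can learn from unlabeled target-language sentences where no gold tags exist.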

UniTrans: Unifying Model Transfer and Data Transfer for Cross-Lingual Named Entity Recognition with Unlabeled Data

microsoft/vert-papers 15 Jul 2020

Prior works in cross-lingual named entity recognition (NER) with no/little labeled data fall into two primary categories: model transfer based and data transfer based methods.

Semi-Supervised Disentangled Framework for Transferable Named Entity Recognition

DMIRLAB-Group/SSD 22 Dec 2020

In the proposed framework, the domain-specific information is integrated with the domain-specific latent variables by using a domain predictor.

Towards Multi-Sense Cross-Lingual Alignment of Contextual Embeddings

ntunlp/multisense_embedding_alignment COLING 2022

We operationalize our framework by first proposing a novel sense-aware cross entropy loss to model word senses explicitly.

AdvPicker: Effectively Leveraging Unlabeled Data via Adversarial Discriminator for Cross-Lingual NER

microsoft/vert-papers ACL 2021

Neural methods have been shown to achieve high performance in Named Entity Recognition (NER), but rely on costly high-quality labeled data for training, which is not always available across languages.

MELM: Data Augmentation with Masked Entity Language Modeling for Low-Resource NER

randyzhouran/melm ACL 2022

Data augmentation is an effective solution to data scarcity in low-resource scenarios.
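MELM's augmentation idea is to mask tokens at entity positions and let a fine-tuned masked language model propose context-compatible replacement entities. A hedged sketch of just the masking step over BIO-labeled tokens (the `[MASK]` placeholder and helper name are illustrative):

```python
def mask_entity_tokens(tokens, bio_labels, mask_token="[MASK]"):
    """Replace tokens at entity positions (B-/I- labels) with a mask token,
    leaving O-labeled context words intact; an entity-aware masked LM would
    then fill the masks with plausible substitute entities to generate
    augmented training sentences."""
    return [
        mask_token if label != "O" else tok
        for tok, label in zip(tokens, bio_labels)
    ]

tokens = ["Obama", "visited", "Berlin", "yesterday"]
labels = ["B-PER", "O", "B-LOC", "O"]
masked = mask_entity_tokens(tokens, labels)
# masked == ["[MASK]", "visited", "[MASK]", "yesterday"]
```

The full method additionally conditions the LM on the entity labels so that, for instance, a masked B-LOC position is filled with a location rather than an arbitrary word.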

An Unsupervised Multiple-Task and Multiple-Teacher Model for Cross-lingual Named Entity Recognition

BDBC-KG-NLP/ACL2022_MTMT ACL ARR November 2021

In this study, building on a knowledge distillation framework and multi-task learning, we introduce a similarity metric model as an auxiliary task to improve cross-lingual NER performance on the target domain.
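In multi-task setups like this, the auxiliary objective is typically folded into training as a weighted sum with the main NER (distillation) loss. A generic sketch; the weighting scheme here is illustrative, not the paper's exact formulation:

```python
def multi_task_loss(ner_loss, similarity_loss, aux_weight=0.5):
    """Combine the main NER loss with an auxiliary entity-similarity loss;
    aux_weight trades off how strongly the auxiliary task shapes training."""
    return ner_loss + aux_weight * similarity_loss

total = multi_task_loss(1.2, 0.4, aux_weight=0.5)
```

Setting `aux_weight=0` recovers plain single-task training, which makes the contribution of the auxiliary task easy to ablate.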

CROP: Zero-shot Cross-lingual Named Entity Recognition with Multilingual Labeled Sequence Translation

YuweiYin/CROP 13 Oct 2022

Specifically, the target sequence is first translated into the source language and then tagged by a source NER model.
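That translate-then-tag step can be sketched as a small pipeline; `translate_to_source` and `source_ner_tagger` below are hypothetical stand-ins for an MT system and a source-language NER model:

```python
def translate_then_tag(target_sentence, translate_to_source, source_ner_tagger):
    """Zero-shot tagging in the spirit of CROP: translate the target-language
    sentence into the source language, then tag it with a source NER model.
    The labeled source sequence would subsequently be translated back so the
    labels land on the original target-language words."""
    source_sentence = translate_to_source(target_sentence)
    labeled = source_ner_tagger(source_sentence)
    return source_sentence, labeled

# Toy stand-ins to show the data flow (not real MT/NER systems):
fake_translate = lambda s: "Angela Merkel visited Paris"
fake_tagger = lambda s: [("Angela", "B-PER"), ("Merkel", "I-PER"),
                         ("visited", "O"), ("Paris", "B-LOC")]
src, tags = translate_then_tag("Angela Merkel besuchte Paris",
                               fake_translate, fake_tagger)
```

The appeal of this direction is that both the translator and the tagger operate in the source language, where labeled data and strong models are plentiful.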

ConNER: Consistency Training for Cross-lingual Named Entity Recognition

randyzhouran/conner 17 Nov 2022

We propose ConNER, a novel consistency training framework for cross-lingual NER, which comprises (1) translation-based consistency training on unlabeled target-language data and (2) dropout-based consistency training on labeled source-language data.
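Both consistency objectives reduce to penalizing divergence between two predictive distributions for the same content, whether produced from a translation pair or from two dropout forward passes. A minimal sketch of a symmetric KL consistency term (purely illustrative, not ConNER's exact loss):

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two discrete tag distributions."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def consistency_loss(dist_a, dist_b):
    """Symmetric KL between two predictions for the same token, e.g. from a
    sentence and its translation, or from two dropout forward passes."""
    return 0.5 * (kl_divergence(dist_a, dist_b) + kl_divergence(dist_b, dist_a))

# Identical predictions incur zero penalty; disagreement is penalized.
same = consistency_loss([0.2, 0.8], [0.2, 0.8])
diff = consistency_loss([0.2, 0.8], [0.6, 0.4])
```

Minimizing such a term pushes the model toward predictions that are stable under translation and under dropout noise, which is the regularization effect consistency training relies on.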

Frustratingly Easy Label Projection for Cross-lingual Transfer

edchengg/easyproject 28 Nov 2022

Translating training data into many languages has emerged as a practical solution for improving cross-lingual transfer.
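One "easy" projection recipe studied in this line of work is to wrap each entity span in markers before translation and read the spans back out of the translated text. A toy sketch; the bracket markers and helper names are illustrative:

```python
import re

def insert_markers(tokens, bio_labels):
    """Wrap each BIO-labeled entity span in square-bracket markers so that a
    translation system carries the span boundaries into the target text."""
    out, i = [], 0
    while i < len(tokens):
        if bio_labels[i].startswith("B-"):
            span = [tokens[i]]
            i += 1
            while i < len(tokens) and bio_labels[i].startswith("I-"):
                span.append(tokens[i])
                i += 1
            out.append("[ " + " ".join(span) + " ]")
        else:
            out.append(tokens[i])
            i += 1
    return " ".join(out)

def extract_marked_spans(translated):
    """Recover the projected entity spans from the markers after translation."""
    return re.findall(r"\[ (.+?) \]", translated)

marked = insert_markers(["Obama", "visited", "New", "York"],
                        ["B-PER", "O", "B-LOC", "I-LOC"])
# marked == "[ Obama ] visited [ New York ]"
spans = extract_marked_spans("[ Obama ] besuchte [ New York ]")
```

The attraction is that no word-alignment model is needed: as long as the translator preserves the markers, the entity boundaries survive the round trip.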