Cross-Lingual NER
23 papers with code • 28 benchmarks • 9 datasets
Most implemented papers
Single-/Multi-Source Cross-Lingual NER via Teacher-Student Learning on Unlabeled Data in Target Language
However, such methods either are not applicable if the labeled data in the source languages is unavailable, or do not leverage information contained in unlabeled data in the target language.
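The teacher-student recipe above can be sketched in miniature: a source-language "teacher" emits soft label distributions over unlabeled target-language tokens, and the student is trained to match them. In this toy version both models are just probability tables; the label set and numbers are illustrative stand-ins for real model outputs.

```python
import math

# Toy teacher-student distillation for cross-lingual NER: the student
# minimizes cross-entropy against the teacher's SOFT labels on
# unlabeled target-language tokens (no gold annotations needed).
LABELS = ["O", "B-PER", "B-LOC"]

def soft_label_loss(teacher_probs, student_probs):
    """Cross-entropy between teacher soft labels and student predictions,
    averaged over tokens (the distillation objective)."""
    total = 0.0
    for t, s in zip(teacher_probs, student_probs):
        total += -sum(ti * math.log(si) for ti, si in zip(t, s))
    return total / len(teacher_probs)

# Teacher's soft predictions on two unlabeled target-language tokens.
teacher = [[0.8, 0.1, 0.1], [0.2, 0.7, 0.1]]

# A student that matches the teacher exactly attains the minimum loss
# (the entropy of the teacher distribution); a uniform student does worse.
perfect = soft_label_loss(teacher, teacher)
uniform = soft_label_loss(teacher, [[1 / 3] * 3, [1 / 3] * 3])
assert perfect < uniform
```

Training then amounts to pushing the student's per-token distributions toward the teacher's, which transfers source-language knowledge without any labeled target data.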
UniTrans: Unifying Model Transfer and Data Transfer for Cross-Lingual Named Entity Recognition with Unlabeled Data
Prior works in cross-lingual named entity recognition (NER) with no/little labeled data fall into two primary categories: model transfer based and data transfer based methods.
Semi-Supervised Disentangled Framework for Transferable Named Entity Recognition
In the proposed framework, the domain-specific information is integrated with the domain-specific latent variables by using a domain predictor.
Towards Multi-Sense Cross-Lingual Alignment of Contextual Embeddings
We operationalize our framework by first proposing a novel sense-aware cross entropy loss to model word senses explicitly.
AdvPicker: Effectively Leveraging Unlabeled Data via Adversarial Discriminator for Cross-Lingual NER
Neural methods have been shown to achieve high performance in Named Entity Recognition (NER), but rely on costly high-quality labeled data for training, which is not always available across languages.
MELM: Data Augmentation with Masked Entity Language Modeling for Low-Resource NER
Data augmentation is an effective solution to data scarcity in low-resource scenarios.
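The masked-entity augmentation idea can be sketched as follows: mask the entity tokens in a labeled sentence and refill them with type-compatible replacements, so the labels carry over unchanged. MELM itself refills the masks with a fine-tuned masked language model; here a small per-type gazetteer stands in for the language model (an assumption for the sake of a runnable toy).

```python
import random

# Simplified masked-entity augmentation: replace each entity token with a
# same-type alternative while keeping the label sequence intact.
# GAZETTEER plays the role of the masked LM (hypothetical stand-in).
GAZETTEER = {"LOC": ["Paris", "Oslo"], "PER": ["Ada", "Grace"]}

def augment(tokens, labels, rng):
    """Return a new token sequence with entity tokens resampled;
    labels are preserved verbatim, so the augmented pair is still valid."""
    new_tokens = []
    for tok, lab in zip(tokens, labels):
        if lab.startswith("B-"):
            ent_type = lab[2:]
            new_tokens.append(rng.choice(GAZETTEER[ent_type]))  # refill the "mask"
        else:
            new_tokens.append(tok)  # non-entity context is left untouched
    return new_tokens, labels

rng = random.Random(0)
tokens, labels = augment(["Ada", "visited", "Oslo"], ["B-PER", "O", "B-LOC"], rng)
assert labels == ["B-PER", "O", "B-LOC"]   # labels survive augmentation
assert tokens[1] == "visited"              # context tokens are unchanged
```

The key property, visible in the assertions, is that augmentation is label-preserving: only entity surface forms change, which is what makes the generated sentences usable as extra training data.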
An Unsupervised Multiple-Task and Multiple-Teacher Model for Cross-lingual Named Entity Recognition
In this study, building on the knowledge distillation framework and multi-task learning, we introduce a similarity metric model as an auxiliary task to improve cross-lingual NER performance in the target domain.
CROP: Zero-shot Cross-lingual Named Entity Recognition with Multilingual Labeled Sequence Translation
Specifically, the target sequence is first translated into the source language and then tagged by a source NER model.
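The translate-then-tag pipeline described above can be sketched with toy stand-ins: a word-for-word dictionary plays the translator, a gazetteer lookup plays the source-language NER model, and labels are projected back by position. Real systems use machine translation and a learned tagger; the German example and both lookup tables are hypothetical.

```python
# Toy translate-then-tag pipeline for zero-shot cross-lingual NER:
# 1) translate target-language tokens into the source language,
# 2) tag the translation with a source-side NER model,
# 3) project the labels back onto the original tokens.
DE_TO_EN = {"Berlin": "Berlin", "liegt": "lies", "in": "in", "Deutschland": "Germany"}
EN_GAZETTEER = {"Berlin": "B-LOC", "Germany": "B-LOC"}

def translate(tokens):
    # word-for-word "translation" (stand-in for a real MT system)
    return [DE_TO_EN.get(t, t) for t in tokens]

def tag_source(tokens):
    # gazetteer lookup (stand-in for a trained source-language tagger)
    return [EN_GAZETTEER.get(t, "O") for t in tokens]

def cross_lingual_tag(target_tokens):
    # project labels back by position (valid only for 1:1 monotonic alignment)
    return list(zip(target_tokens, tag_source(translate(target_tokens))))

result = cross_lingual_tag(["Berlin", "liegt", "in", "Deutschland"])
# [('Berlin', 'B-LOC'), ('liegt', 'O'), ('in', 'O'), ('Deutschland', 'B-LOC')]
```

The positional projection in step 3 is the fragile part: once translation reorders, splits, or merges words, a proper word-alignment step is needed to carry the labels back.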
ConNER: Consistency Training for Cross-lingual Named Entity Recognition
We propose ConNER as a novel consistency training framework for cross-lingual NER, which comprises: (1) translation-based consistency training on unlabeled target-language data, and (2) dropout-based consistency training on labeled source-language data.
Frustratingly Easy Label Projection for Cross-lingual Transfer
Translating training data into many languages has emerged as a practical solution for improving cross-lingual transfer.
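One simple way to keep entity annotations intact through translation, in the spirit of the entry above, is to wrap each labeled span in markers before translating and recover the spans from the markers afterwards. In this sketch the "translator" is an identity stand-in for a real MT system, and the marker syntax is an illustrative choice, not the paper's exact format.

```python
import re

# Toy marker-based label projection: entity spans are wrapped in
# bracket markers so they survive translation as explicit delimiters.
def insert_markers(tokens, spans):
    """spans: list of (start, end, label) over token indices, end exclusive.
    Processed right-to-left so earlier indices stay valid as we insert."""
    out = list(tokens)
    for start, end, label in sorted(spans, reverse=True):
        out[start:end] = [f"[{label}]"] + out[start:end] + [f"[/{label}]"]
    return " ".join(out)

def recover_spans(translated):
    # pull labeled spans back out of the marked-up "translation";
    # the backreference \1 enforces matching open/close markers
    return re.findall(r"\[([A-Z]+)\] (.*?) \[/\1\]", translated)

marked = insert_markers(["Ada", "Lovelace", "lived", "in", "London"],
                        [(0, 2, "PER"), (4, 5, "LOC")])
translated = marked  # identity stand-in for machine translation
spans = recover_spans(translated)
# [('PER', 'Ada Lovelace'), ('LOC', 'London')]
```

A real MT system may drop or mangle the markers, which is exactly the failure mode that label-projection papers measure and mitigate.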