Zero-Resource Cross-Lingual Named Entity Recognition

22 Nov 2019  ·  M Saiful Bari, Shafiq Joty, Prathyusha Jwalapuram

Recently, neural methods have achieved state-of-the-art (SOTA) results on Named Entity Recognition (NER) tasks for many languages without the need for manually crafted features. However, these models still require manually annotated training data, which is not available for many languages. In this paper, we propose a cross-lingual NER model that transfers NER knowledge from one language to another in a completely unsupervised way, without relying on any bilingual dictionary or parallel data. Our model achieves this through word-level adversarial learning and augmented fine-tuning with parameter sharing and feature augmentation. Experiments on five different languages demonstrate the effectiveness of our approach, which outperforms existing models by a good margin and sets a new SOTA for each language pair.
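
The word-level adversarial learning step aligns pretrained monolingual word embeddings of the source and target languages in a shared space without bilingual supervision: a linear mapper projects source embeddings into the target space, while a discriminator tries to tell mapped source vectors apart from real target vectors. Below is a minimal PyTorch sketch of this kind of adversarial alignment (in the spirit of MUSE-style mapping); the random tensors, network sizes, and hyperparameters are illustrative placeholders, not the authors' implementation.

```python
# Minimal sketch of word-level adversarial embedding alignment.
# All names and values here (src_emb, tgt_emb, sizes, learning rates)
# are illustrative placeholders, not the paper's actual code.
import torch
import torch.nn as nn

dim, n_words, batch = 300, 5000, 32
src_emb = torch.randn(n_words, dim)  # stand-in for pretrained source-language embeddings
tgt_emb = torch.randn(n_words, dim)  # stand-in for pretrained target-language embeddings

# Mapper: linear map W projecting source embeddings into the target space.
# (Full methods typically also keep W near-orthogonal; omitted for brevity.)
mapper = nn.Linear(dim, dim, bias=False)

# Discriminator: classifies mapped source vectors (0) vs. real target vectors (1).
disc = nn.Sequential(
    nn.Linear(dim, 512), nn.LeakyReLU(0.2),
    nn.Linear(512, 1),
)

bce = nn.BCEWithLogitsLoss()
opt_map = torch.optim.SGD(mapper.parameters(), lr=0.1)
opt_disc = torch.optim.SGD(disc.parameters(), lr=0.1)

for step in range(1000):
    idx = torch.randint(n_words, (batch,))
    x_src, x_tgt = src_emb[idx], tgt_emb[idx]

    # 1) Train the discriminator to separate W(src) from tgt.
    with torch.no_grad():
        mapped = mapper(x_src)
    logits = disc(torch.cat([mapped, x_tgt]))
    labels = torch.cat([torch.zeros(batch, 1), torch.ones(batch, 1)])
    loss_d = bce(logits, labels)
    opt_disc.zero_grad(); loss_d.backward(); opt_disc.step()

    # 2) Train the mapper to fool the discriminator (flipped labels).
    logits = disc(mapper(x_src))
    loss_m = bce(logits, torch.ones(batch, 1))
    opt_map.zero_grad(); loss_m.backward(); opt_map.step()
```

After alignment, the mapped source-language embeddings can feed an NER tagger trained on source-language annotations, which is what makes zero-resource transfer to the target language possible.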


Datasets

| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Low Resource Named Entity Recognition | CoNLL 2003 Dutch | Zero-Resource Transfer From CoNLL-2003 English dataset | F1 score | 74.61 | #1 |
| Low Resource Named Entity Recognition | CoNLL 2003 German | Zero-Resource Transfer From CoNLL-2003 English dataset | F1 score | 65.24 | #1 |
| Low Resource Named Entity Recognition | CoNLL 2003 Spanish | Zero-Resource Cross-lingual Transfer From CoNLL-2003 English dataset | F1 score | 75.93 | #1 |