Low-Resource Neural Machine Translation

23 papers with code • 1 benchmark • 4 datasets

Low-resource machine translation is the task of translating to or from a low-resource language, i.e. a language for which large parallel corpora may not be available.

Latest papers with no code

Optimizing Transformer for Low-Resource Neural Machine Translation

no code yet • COLING 2020

Language pairs with limited amounts of parallel data, also known as low-resource languages, remain a challenge for neural machine translation.
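
In practice, "optimizing" a Transformer for such pairs usually means shrinking and more heavily regularizing the model, since the standard base configuration overfits small corpora. A rough PyTorch illustration of that idea (the values below are placeholders, not the paper's tuned settings):

```python
import torch.nn as nn

# A scaled-down Transformer for a low-resource pair (illustrative values only).
model = nn.Transformer(
    d_model=256,            # smaller embedding size than the base 512
    nhead=4,                # fewer attention heads
    num_encoder_layers=3,   # shallower encoder
    num_decoder_layers=3,   # shallower decoder
    dim_feedforward=1024,   # narrower feed-forward sublayer
    dropout=0.3,            # heavier regularization for small data
)
```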

Collective Wisdom: Improving Low-resource Neural Machine Translation using Adaptive Knowledge Distillation

no code yet • COLING 2020

Scarcity of parallel sentence-pairs poses a significant hurdle for training high-quality Neural Machine Translation (NMT) models in bilingually low-resource scenarios.
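
As a rough sketch of what adaptive knowledge distillation can look like in this setting: the student is trained on a mixture of gold cross-entropy and teacher distributions, with per-teacher weights set adaptively. The confidence-based weighting rule below is an assumed illustration, not the paper's exact criterion:

```python
import torch
import torch.nn.functional as F

def adaptive_distillation_loss(student_logits, teacher_logits_list, targets,
                               temperature=2.0, alpha=0.5):
    """Blend gold cross-entropy with a confidence-weighted mixture of
    teacher distributions (a generic sketch, not the paper's exact rule)."""
    ce = F.cross_entropy(student_logits, targets)

    # Weight each teacher by its average confidence on the gold token
    # (one plausible adaptive criterion, assumed here for illustration).
    confidences = torch.stack([
        F.softmax(t, dim=-1).gather(-1, targets.unsqueeze(-1)).mean()
        for t in teacher_logits_list
    ])
    weights = F.softmax(confidences, dim=0)

    kd = sum(
        w * F.kl_div(
            F.log_softmax(student_logits / temperature, dim=-1),
            F.softmax(t / temperature, dim=-1),
            reduction="batchmean",
        )
        for w, t in zip(weights, teacher_logits_list)
    )
    return alpha * ce + (1 - alpha) * kd * temperature**2
```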

Enhanced back-translation for low resource neural machine translation using self-training

no code yet • 4 Jun 2020

The synthetic data generated by the improved English-German backward model was used to train a forward model which outperformed another forward model trained using standard back-translation by 2.7 BLEU.
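
A minimal data-flow sketch of this pipeline, with a toy `train_nmt` standing in for a real NMT toolkit (the helper and data are hypothetical; only the ordering of the steps is the point):

```python
# Back-translation enhanced with self-training: the backward model is first
# improved on its own outputs, then used to back-translate monolingual text.

def train_nmt(pairs):
    """Toy stand-in for fitting an NMT model: memorize (source, target) pairs."""
    table = dict(pairs)
    return lambda src: table.get(src, "<unk>")

real_parallel = [("hello", "hallo")]   # (src, tgt) pairs
mono_target = ["hallo", "danke"]       # monolingual target-side text

# 1. Train the backward (tgt -> src) model on the reversed parallel data.
backward = train_nmt([(t, s) for s, t in real_parallel])

# 2. Self-training: the backward model labels monolingual target text and is
#    retrained on its own synthetic outputs (filtered, in practice) plus the
#    real data.
self_labeled = [(t, backward(t)) for t in mono_target]
backward = train_nmt([(t, s) for s, t in real_parallel] + self_labeled)

# 3. Standard back-translation with the improved backward model: synthetic
#    (src, tgt) pairs train the forward model alongside the real data.
synthetic = [(backward(t), t) for t in mono_target]
forward = train_nmt(real_parallel + synthetic)
print(forward("hello"))  # -> "hallo"
```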

Leveraging Monolingual Data with Self-Supervision for Multilingual Neural Machine Translation

no code yet • ACL 2020

Over the last few years two promising research directions in low-resource neural machine translation (NMT) have emerged.

An Analysis of Massively Multilingual Neural Machine Translation for Low-Resource Languages

no code yet • LREC 2020

We find that best practices in this domain are highly language-specific: adding more languages to a training set is often better, but too many harm performance; the best number depends on the source language.

Adaptively Scheduled Multitask Learning: The Case of Low-Resource Neural Machine Translation

no code yet • WS 2019

The role of the training schedule becomes even more crucial in biased-MTL, where the goal is to improve one (or a subset) of the tasks the most, e.g. translation quality.
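
One simple way to realize such a biased schedule is to anneal the mixing weights of the per-task losses toward the target task as training progresses. The linear annealing below is an assumed form for illustration; the paper's schedule is learned adaptively:

```python
import torch

def biased_mtl_loss(task_losses, target_idx, step, total_steps):
    """Sketch of a biased-MTL schedule: auxiliary tasks start with equal
    weight and are annealed so the target task (e.g. translation)
    dominates by the end of training (assumed form, not the paper's policy)."""
    progress = step / total_steps
    weights = torch.full((len(task_losses),), (1 - progress) / len(task_losses))
    weights[target_idx] += progress  # shift mass onto the target task
    return sum(w * l for w, l in zip(weights, task_losses))

# e.g. two tasks, translation is task 0, halfway through training:
loss = biased_mtl_loss([torch.tensor(1.2), torch.tensor(0.8)],
                       target_idx=0, step=5000, total_steps=10000)
```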

Exploiting Multilingualism through Multistage Fine-Tuning for Low-Resource Neural Machine Translation

no code yet • IJCNLP 2019

This paper highlights the impressive utility of multi-parallel corpora for transfer learning in a one-to-many low-resource neural machine translation (NMT) setting.
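
The general recipe is a sequence of fine-tuning stages that moves from broad multilingual data toward the low-resource pair of interest. The staging below is one plausible instantiation (the `finetune` helper and corpus names are placeholders, not the paper's exact schedule):

```python
# A multistage fine-tuning sketch for a one-to-many low-resource setting.

def finetune(model, corpus, epochs):
    """Stand-in for a real training run; returns the updated model."""
    print(f"fine-tuning {model} on {corpus} for {epochs} epochs")
    return model

model = "one-to-many transformer"
model = finetune(model, "large multi-parallel corpus (all target languages)", epochs=10)
model = finetune(model, "mix: multi-parallel + low-resource data", epochs=5)
model = finetune(model, "low-resource data only", epochs=3)
```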

A Survey of Methods to Leverage Monolingual Data in Low-resource Neural Machine Translation

no code yet • 1 Oct 2019

Neural machine translation has become the state-of-the-art for language pairs with large parallel corpora.

A Universal Parent Model for Low-Resource Neural Machine Translation Transfer

no code yet • 14 Sep 2019

In this work, we present a 'universal' pre-trained neural parent model with a constant vocabulary that can be used as a starting point for training practically any new low-resource language to a fixed target language.
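
Because the vocabulary is held constant, a child model for any new source language can load the parent's weights one-to-one, with no embedding re-mapping; the new language is simply segmented with the parent's subword model. A minimal sketch (the tiny architecture and checkpoint name are placeholders):

```python
import torch
import torch.nn as nn

VOCAB = 32000  # the parent's constant subword vocabulary, reused by every child

class TinyNMT(nn.Module):
    """Placeholder architecture; the paper's actual model is a full NMT system."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, 256)
        self.transformer = nn.Transformer(d_model=256, nhead=4,
                                          num_encoder_layers=2,
                                          num_decoder_layers=2,
                                          batch_first=True)
        self.out = nn.Linear(256, VOCAB)

    def forward(self, src_ids, tgt_ids):
        h = self.transformer(self.embed(src_ids), self.embed(tgt_ids))
        return self.out(h)

# "Pretrain" and save the universal parent (stand-in for the real checkpoint).
parent = TinyNMT()
torch.save(parent.state_dict(), "universal_parent.pt")

# A child for a NEW low-resource source language: same shapes, same vocabulary,
# so the parent weights load directly and fine-tuning can start immediately.
child = TinyNMT()
child.load_state_dict(torch.load("universal_parent.pt"))
```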

Bilingual Low-Resource Neural Machine Translation with Round-Tripping: The Case of Persian-Spanish

no code yet • RANLP 2019

The quality of Neural Machine Translation (NMT), as a data-driven approach, massively depends on quantity, quality, and relevance of the training dataset.