Low-Resource Neural Machine Translation
23 papers with code • 1 benchmark • 4 datasets
Low-resource machine translation is the task of machine translation for a low-resource language, i.e. one for which large parallel corpora may not be available.
Latest papers with no code
Optimizing Transformer for Low-Resource Neural Machine Translation
Language pairs with limited amounts of parallel data, also known as low-resource languages, remain a challenge for neural machine translation.
Collective Wisdom: Improving Low-resource Neural Machine Translation using Adaptive Knowledge Distillation
Scarcity of parallel sentence-pairs poses a significant hurdle for training high-quality Neural Machine Translation (NMT) models in bilingually low-resource scenarios.
Enhanced back-translation for low resource neural machine translation using self-training
The synthetic data generated by the improved English-German backward model was used to train a forward model which outperformed another forward model trained using standard back-translation by 2.7 BLEU.
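The back-translation pipeline described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: `backward_model` is a hypothetical stand-in for a trained target-to-source translator, and the helper names are assumptions for the sketch.

```python
# Minimal sketch of back-translation for low-resource NMT.
# A backward (target->source) model translates monolingual
# target-language text into synthetic source sentences; the
# resulting (synthetic source, real target) pairs augment the
# parallel data used to train the forward model.

def backward_model(target_sentence: str) -> str:
    # Hypothetical stand-in: a real backward model would be a
    # trained NMT system translating target -> source.
    return "<synthetic source for: " + target_sentence + ">"

def back_translate(monolingual_target: list[str]) -> list[tuple[str, str]]:
    """Pair each monolingual target sentence with a synthetic
    source sentence, yielding extra training pairs."""
    return [(backward_model(t), t) for t in monolingual_target]

synthetic_pairs = back_translate(["Das ist ein Test.", "Guten Morgen."])
```

In the self-training variant the paper describes, the backward model itself is first improved on its own synthetic output before generating the final training data.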
Leveraging Monolingual Data with Self-Supervision for Multilingual Neural Machine Translation
Over the last few years two promising research directions in low-resource neural machine translation (NMT) have emerged.
An Analysis of Massively Multilingual Neural Machine Translation for Low-Resource Languages
We find that best practices in this domain are highly language-specific: adding more languages to a training set is often better, but too many harms performance; the best number depends on the source language.
Adaptively Scheduled Multitask Learning: The Case of Low-Resource Neural Machine Translation
The role of training schedule becomes even more crucial in biased-MTL, where the goal is to improve one (or a subset) of tasks the most, e.g. translation quality.
Exploiting Multilingualism through Multistage Fine-Tuning for Low-Resource Neural Machine Translation
This paper highlights the impressive utility of multi-parallel corpora for transfer learning in a one-to-many low-resource neural machine translation (NMT) setting.
A Survey of Methods to Leverage Monolingual Data in Low-resource Neural Machine Translation
Neural machine translation has become the state-of-the-art for language pairs with large parallel corpora.
A Universal Parent Model for Low-Resource Neural Machine Translation Transfer
In this work, we present a 'universal' pre-trained neural parent model with constant vocabulary that can be used as a starting point for training practically any new low-resource language to a fixed target language.
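The "constant vocabulary" idea can be illustrated with a small sketch. This is an assumption-laden toy, not the paper's method: here a fixed byte-level inventory stands in for whatever language-independent vocabulary the parent model uses, so that text in any new source language maps to the same token ids without resizing embeddings.

```python
# Sketch: encode text from any language into a constant,
# language-independent vocabulary (raw UTF-8 bytes here, as a
# stand-in), so a pre-trained parent model's input embeddings can
# be reused unchanged when transferring to a new low-resource
# source language.

FIXED_VOCAB_SIZE = 256  # one id per byte value (illustrative choice)

def encode(sentence: str) -> list[int]:
    """Map text in any language to ids within the fixed vocabulary."""
    return list(sentence.encode("utf-8"))

ids = encode("Grüß dich")  # non-Latin or accented text still fits
assert all(0 <= i < FIXED_VOCAB_SIZE for i in ids)
```

The design point is that no token id ever falls outside the parent model's embedding table, which is what makes fine-tuning on a previously unseen language possible without architectural changes.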
Bilingual Low-Resource Neural Machine Translation with Round-Tripping: The Case of Persian-Spanish
The quality of Neural Machine Translation (NMT), as a data-driven approach, massively depends on quantity, quality, and relevance of the training dataset.