Low-Resource Neural Machine Translation

22 papers with code • 1 benchmark • 4 datasets

Low-resource machine translation is the task of translating between language pairs for which large parallel corpora are not available.

Latest papers with no code

Self-Augmented In-Context Learning for Unsupervised Word Translation

no code yet • 15 Feb 2024

Recent work has shown that, while large language models (LLMs) demonstrate strong word translation or bilingual lexicon induction (BLI) capabilities in few-shot setups, they still cannot match the performance of 'traditional' mapping-based approaches in the unsupervised scenario where no seed translation pairs are available, especially for lower-resource languages.

Joint Dropout: Improving Generalizability in Low-Resource Neural Machine Translation through Phrase Pair Variables

no code yet • 24 Jul 2023

Despite the tremendous success of Neural Machine Translation (NMT), its performance on low-resource language pairs remains subpar, partly due to the limited ability to handle previously unseen inputs, i.e., generalization.

Exploiting Multilingualism in Low-resource Neural Machine Translation via Adversarial Learning

no code yet • 31 Mar 2023

Generative Adversarial Networks (GAN) offer a promising approach for Neural Machine Translation (NMT).

COMET-QE and Active Learning for Low-Resource Machine Translation

no code yet • 27 Oct 2022

Active learning aims to deliver maximum benefit when resources are scarce.

Cost-Effective Training in Low-Resource Neural Machine Translation

no code yet • 14 Jan 2022

Although active learning (AL) is shown to be helpful with large budgets, it is not enough to build high-quality translation systems in these low-resource conditions.

Machine Translation of Low-Resource Indo-European Languages

no code yet • WMT (EMNLP) 2021

In this work, we investigate methods for the challenging task of translating between low-resource language pairs that exhibit some level of similarity.

A Survey on Low-Resource Neural Machine Translation

no code yet • 9 Jul 2021

Neural approaches have achieved state-of-the-art accuracy on machine translation but suffer from the high cost of collecting large-scale parallel data.

Alternated Training with Synthetic and Authentic Data for Neural Machine Translation

no code yet • Findings (ACL) 2021

In this work, we propose alternated training with synthetic and authentic data for NMT.
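The idea of alternated training can be sketched as a simple schedule that switches the training corpus between synthetic (e.g., back-translated) and authentic parallel data. This is a minimal illustrative sketch, assuming a per-epoch alternation; the paper's actual schedule and criteria may differ.

```python
def alternated_schedule(num_epochs, start="synthetic"):
    """Yield which corpus to use each epoch, alternating between
    synthetic (back-translated) and authentic parallel data.

    Hypothetical helper for illustration only; not from the paper.
    """
    order = ("synthetic", "authentic")
    if start == "authentic":
        order = ("authentic", "synthetic")
    for epoch in range(num_epochs):
        yield order[epoch % 2]


# Example: a 4-epoch run alternates corpora epoch by epoch.
print(list(alternated_schedule(4)))
```

In a real NMT training loop, each yielded label would select which data loader feeds the model for that epoch, so the model never overfits to noise in the synthetic data alone.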

Self-supervised and Supervised Joint Training for Resource-rich Machine Translation

no code yet • 8 Jun 2021

Self-supervised pre-training of text representations has been successfully applied to low-resource Neural Machine Translation (NMT).