Low-Resource Neural Machine Translation
22 papers with code • 1 benchmark • 4 datasets
Low-resource machine translation is the task of translating to or from a low-resource language, i.e., one for which large parallel corpora may not be available.
Latest papers with no code
Self-Augmented In-Context Learning for Unsupervised Word Translation
Recent work has shown that, while large language models (LLMs) demonstrate strong word translation or bilingual lexicon induction (BLI) capabilities in few-shot setups, they still cannot match the performance of 'traditional' mapping-based approaches in the unsupervised scenario where no seed translation pairs are available, especially for lower-resource languages.
Joint Dropout: Improving Generalizability in Low-Resource Neural Machine Translation through Phrase Pair Variables
Despite the tremendous success of Neural Machine Translation (NMT), its performance on low-resource language pairs remains subpar, partly due to a limited ability to handle previously unseen inputs, i.e., poor generalization.
Exploiting Multilingualism in Low-resource Neural Machine Translation via Adversarial Learning
Generative Adversarial Networks (GAN) offer a promising approach for Neural Machine Translation (NMT).
COMET-QE and Active Learning for Low-Resource Machine Translation
Active learning aims to deliver maximum benefit when resources are scarce.
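The core idea of combining a quality-estimation metric like COMET-QE with active learning can be sketched as follows: score the unlabeled source pool with a reference-free QE model and send the sentences the current system translates worst to human translators. This is a minimal sketch; `quality_score` is a hypothetical stand-in for a real QE model, not the paper's actual implementation.

```python
def select_for_annotation(pool, quality_score, budget):
    """Pick the `budget` source sentences with the lowest estimated
    translation quality, so scarce human translation effort goes
    where the current MT system is weakest."""
    ranked = sorted(pool, key=quality_score)  # lowest estimated quality first
    return ranked[:budget]

# Toy usage with made-up scores standing in for QE-model output.
pool = ["sent a", "sent b", "sent c", "sent d"]
scores = {"sent a": 0.9, "sent b": 0.2, "sent c": 0.5, "sent d": 0.1}
picked = select_for_annotation(pool, scores.get, budget=2)
# picked == ["sent d", "sent b"]
```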
Cost-Effective Training in Low-Resource Neural Machine Translation
Although active learning (AL) is shown to be helpful with large budgets, it is not enough to build high-quality translation systems in these low-resource conditions.
Machine Translation of Low-Resource Indo-European Languages
In this work, we investigate methods for the challenging task of translating between low-resource language pairs that exhibit some level of similarity.
A Survey on Low-Resource Neural Machine Translation
Neural approaches have achieved state-of-the-art accuracy on machine translation but suffer from the high cost of collecting large scale parallel data.
Alternated Training with Synthetic and Authentic Data for Neural Machine Translation
In this work, we propose alternated training with synthetic and authentic data for NMT.
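The training schedule implied here — alternating passes over synthetic (e.g. back-translated) and authentic parallel data rather than simply mixing them — can be sketched schematically. `alternated_training` and `train_step` are hypothetical names for illustration, not the paper's code; the choice to end each round on authentic data is one plausible design, so the most recent updates come from gold-standard pairs.

```python
def alternated_training(train_step, synthetic_batches, authentic_batches, rounds=2):
    """Alternate full passes over synthetic and authentic batches,
    finishing each round on authentic data. Returns the schedule of
    (data_type, batch) pairs actually trained on, for inspection."""
    schedule = []
    for _ in range(rounds):
        for batch in synthetic_batches:
            train_step(batch)
            schedule.append(("synthetic", batch))
        for batch in authentic_batches:
            train_step(batch)
            schedule.append(("authentic", batch))
    return schedule

# Toy usage: record which batches a dummy train step would see, in order.
seen = []
schedule = alternated_training(seen.append, ["syn1"], ["auth1"], rounds=2)
# seen == ["syn1", "auth1", "syn1", "auth1"]
```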
Self-supervised and Supervised Joint Training for Resource-rich Machine Translation
Self-supervised pre-training of text representations has been successfully applied to low-resource Neural Machine Translation (NMT).