Search Results for author: Aizhan Imankulova

Found 12 papers, 7 papers with code

Studying The Impact Of Document-level Context On Simultaneous Neural Machine Translation

no code implementations MTSummit 2021 Raj Dabre, Aizhan Imankulova, Masahiro Kaneko

To this end, in this paper, we propose wait-k simultaneous document-level NMT, where we keep the context encoder as it is and replace the source sentence encoder and target-language decoder with their wait-k equivalents.

Machine Translation NMT +2
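
The entry above centers on wait-k decoding: the decoder emits its first target token only after reading k source tokens, then alternates reads and writes. A minimal sketch of plain sentence-level greedy wait-k decoding, assuming a hypothetical `model.predict_next` interface (this is an illustration, not the paper's document-level implementation):

```python
# Sketch of greedy wait-k simultaneous decoding over an incremental
# source token stream. `model.predict_next` is a hypothetical interface
# that returns the next target token given the source prefix read so far.
def wait_k_decode(model, source_stream, k, max_len=100):
    src_prefix, target = [], []
    for token in source_stream:          # READ: consume one source token
        src_prefix.append(token)
        if len(src_prefix) >= k:         # WRITE: once k tokens are available,
            y = model.predict_next(src_prefix, target)  # emit one target token
            if y == "</s>":
                return target
            target.append(y)
    # Source exhausted: finish the translation on the full source prefix.
    while len(target) < max_len:
        y = model.predict_next(src_prefix, target)
        if y == "</s>":
            break
        target.append(y)
    return target
```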

Gender Bias in Masked Language Models for Multiple Languages

1 code implementation NAACL 2022 Masahiro Kaneko, Aizhan Imankulova, Danushka Bollegala, Naoaki Okazaki

Unfortunately, it was reported that MLMs also learn discriminative biases regarding attributes such as gender and race.

Attribute Sentence
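
As context for the finding above, one simple way to surface such biases is to compare the probabilities a masked LM assigns to gendered words in a stereotyped template. The following probe is a minimal illustration, not the paper's evaluation protocol; the model name and template sentence are illustrative choices:

```python
from transformers import pipeline

# Illustrative multilingual MLM; any masked LM with a [MASK] token works similarly.
fill_mask = pipeline("fill-mask", model="bert-base-multilingual-cased")

# Occupation template (illustrative). A large probability gap between the two
# gendered fillers hints at a learned gender association for the occupation.
for pred in fill_mask("[MASK] is a doctor.", targets=["He", "She"]):
    print(pred["token_str"], round(pred["score"], 4))
```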

Simultaneous Multi-Pivot Neural Machine Translation

no code implementations15 Apr 2021 Raj Dabre, Aizhan Imankulova, Masahiro Kaneko, Abhisek Chakrabarty

Parallel corpora are indispensable for training neural machine translation (NMT) models, and parallel corpora for most language pairs do not exist or are scarce.

Machine Translation NMT +1

Cross-lingual Transfer Learning for Grammatical Error Correction

no code implementations COLING 2020 Ikumi Yamashita, Satoru Katsumata, Masahiro Kaneko, Aizhan Imankulova, Mamoru Komachi

Cross-lingual transfer learning from high-resource languages (the source models) is effective for training models of low-resource languages (the target models) for various tasks.

Cross-Lingual Transfer Grammatical Error Correction +1
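
The transfer recipe described above amounts to continuing training from source-language weights on a small target-language corpus. A minimal sketch of that generic recipe, assuming a multilingual seq2seq backbone as a stand-in for the source model (not the paper's exact setup; the model name and toy data are illustrative):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Illustrative multilingual backbone standing in for a source-language GEC model.
model_name = "google/mt5-small"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Toy stand-in for scarce target-language GEC pairs (erroneous -> corrected).
pairs = [("he go to school .", "he goes to school .")]

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)  # gentle LR for transfer
model.train()
for noisy, clean in pairs:
    batch = tokenizer(noisy, return_tensors="pt")
    labels = tokenizer(clean, return_tensors="pt").input_ids
    loss = model(**batch, labels=labels).loss  # standard seq2seq cross-entropy
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```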

English-to-Japanese Diverse Translation by Combining Forward and Backward Outputs

no code implementations WS 2020 Masahiro Kaneko, Aizhan Imankulova, Tosho Hirasawa, Mamoru Komachi

We introduce our TMU system submitted to the English-to-Japanese (En→Ja) track of the Simultaneous Translation And Paraphrase for Language Education (STAPLE) shared task at the 4th Workshop on Neural Generation and Translation (WNGT2020).

Machine Translation NMT +2

Towards Multimodal Simultaneous Neural Machine Translation

1 code implementation WMT (EMNLP) 2020 Aizhan Imankulova, Masahiro Kaneko, Tosho Hirasawa, Mamoru Komachi

Simultaneous translation involves translating a sentence before the speaker's utterance is completed in order to realize real-time understanding in multiple languages.

Machine Translation Sentence +1

Japanese-Russian TMU Neural Machine Translation System using Multilingual Model for WAT 2019

no code implementations WS 2019 Aizhan Imankulova, Masahiro Kaneko, Mamoru Komachi

We introduce our system submitted to the News Commentary task (Japanese↔Russian) of the 6th Workshop on Asian Translation.

Machine Translation NMT +1

Exploiting Out-of-Domain Parallel Data through Multilingual Transfer Learning for Low-Resource Neural Machine Translation

1 code implementation WS 2019 Aizhan Imankulova, Raj Dabre, Atsushi Fujita, Kenji Imamura

This paper proposes a novel multilingual multistage fine-tuning approach for low-resource neural machine translation (NMT), taking a challenging Japanese–Russian pair for benchmarking.

Benchmarking Domain Adaptation +4
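
The multistage idea above can be pictured as one model passing through a sequence of training stages whose data moves progressively closer to the low-resource in-domain pair, resuming from the previous stage's weights each time. A schematic sketch of such a schedule (a generic rendering, not the paper's exact recipe; `train` and the corpus names are hypothetical placeholders):

```python
# Each stage resumes from the previous stage's weights (no re-initialization),
# with a smaller learning rate as the data gets smaller and more in-domain.
def train(model, corpus, lr):
    """Placeholder for one stage of standard NMT training on `corpus`."""
    print(f"training on {corpus} at lr={lr}")
    return model  # in practice: the updated weights

stages = [
    ("multilingual_out_of_domain", 1e-4),  # stage 1: large, many language pairs
    ("ja_ru_out_of_domain",        5e-5),  # stage 2: target pair, out of domain
    ("ja_ru_in_domain",            1e-5),  # stage 3: small in-domain Ja-Ru data
]

model = {}  # stand-in for initial transformer NMT parameters
for corpus, lr in stages:
    model = train(model, corpus, lr)
```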
