no code implementations • WMT (EMNLP) 2021 • Seunghyun Lim, Hantae Kim, Hyunjoong Kim
Our multilingual Quality Estimation system explores the combination of Pretrained Language Models and Multi-task Learning architectures.
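A minimal sketch of what such a combination could look like: a shared encoder (standing in for a pretrained multilingual LM) feeding both a sentence-level quality-score head and a word-level OK/BAD tagging head, trained with a joint loss. All dimensions, names, and the task weighting are illustrative assumptions, not the paper's actual system.

```python
import torch
import torch.nn as nn

class MultiTaskQE(nn.Module):
    """Hypothetical multi-task QE model: one shared encoder, two task heads."""
    def __init__(self, vocab_size=32000, d_model=256, n_heads=4, n_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        # Stand-in for a pretrained multilingual LM such as XLM-R.
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.sent_head = nn.Linear(d_model, 1)  # sentence-level quality regression
        self.word_head = nn.Linear(d_model, 2)  # word-level OK/BAD tagging

    def forward(self, ids):
        h = self.encoder(self.embed(ids))                 # (batch, seq, d_model)
        sent = self.sent_head(h.mean(dim=1)).squeeze(-1)  # mean-pool -> score
        word = self.word_head(h)                          # per-token logits
        return sent, word

model = MultiTaskQE()
ids = torch.randint(0, 32000, (2, 10))
sent_pred, word_logits = model(ids)
# Joint objective: sentence-level regression plus word-level classification.
loss = nn.MSELoss()(sent_pred, torch.rand(2)) + \
       nn.CrossEntropyLoss()(word_logits.reshape(-1, 2), torch.randint(0, 2, (20,)))
loss.backward()
```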
no code implementations • 24 Oct 2022 • Jiyoung Lee, Hantae Kim, Hyunchang Cho, Edward Choi, Cheonbok Park
Multi-domain Neural Machine Translation (NMT) trains a single model on data from multiple domains.
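One common recipe for conditioning a single model on multiple domains is to prepend a domain token to each source sentence; the sketch below shows this preprocessing step. The domain inventory and token names are hypothetical, and this is not necessarily the method of this paper.

```python
# Hypothetical domain-tag preprocessing for multi-domain NMT: the model learns
# to condition on the leading tag, so one set of parameters serves all domains.
DOMAINS = {"medical": "<med>", "law": "<law>", "it": "<it>"}

def tag_source(src: str, domain: str) -> str:
    """Prepend the domain token to a source sentence."""
    return f"{DOMAINS[domain]} {src}"

corpus = [
    ("the patient was administered 5 mg", "medical"),
    ("the contract shall terminate", "law"),
]
tagged = [tag_source(src, dom) for src, dom in corpus]
print(tagged)
# ['<med> the patient was administered 5 mg', '<law> the contract shall terminate']
```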
no code implementations • Findings (ACL) 2022 • Cheonbok Park, Hantae Kim, Ioan Calapodescu, Hyunchang Cho, Vassilina Nikoulina
Domain Adaptation (DA) of a Neural Machine Translation (NMT) model often relies on a pre-trained general NMT model that is adapted to the new domain using a sample of in-domain parallel data.
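A minimal sketch of this standard setup: continue training a general pretrained NMT checkpoint on a handful of in-domain sentence pairs. The checkpoint name, data, and learning rate are illustrative assumptions; the paper's actual adaptation procedure will differ.

```python
# Fine-tuning a general-domain NMT model on in-domain parallel data (a sketch).
import torch
from transformers import MarianMTModel, MarianTokenizer

name = "Helsinki-NLP/opus-mt-en-de"  # assumed general-domain checkpoint
tokenizer = MarianTokenizer.from_pretrained(name)
model = MarianMTModel.from_pretrained(name)

# A (tiny, illustrative) sample of in-domain parallel data.
in_domain = [("the patient shows mild symptoms",
              "der Patient zeigt leichte Symptome")]

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)
model.train()
for src, tgt in in_domain:
    batch = tokenizer(src, text_target=tgt, return_tensors="pt")
    loss = model(**batch).loss  # teacher-forced cross-entropy on the in-domain pair
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```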