Search Results for author: Hantae Kim

Found 3 papers, 0 papers with code

Papago’s Submission for the WMT21 Quality Estimation Shared Task

no code implementations • WMT (EMNLP) 2021 • Seunghyun Lim, Hantae Kim, Hyunjoong Kim

Our multilingual Quality Estimation system explores the combination of Pretrained Language Models and Multi-task Learning architectures.

Tasks: Knowledge Distillation, Multi-Task Learning +1

DaLC: Domain Adaptation Learning Curve Prediction for Neural Machine Translation

no code implementations • Findings (ACL) 2022 • Cheonbok Park, Hantae Kim, Ioan Calapodescu, Hyunchang Cho, Vassilina Nikoulina

Domain Adaptation (DA) of a Neural Machine Translation (NMT) model often relies on a pre-trained general NMT model, which is adapted to the new domain on a sample of in-domain parallel data.

Tasks: Domain Adaptation, Machine Translation +2
