Search Results for author: Hwichan Kim

Found 5 papers, 0 papers with code

TMU NMT System with Japanese BART for the Patent task of WAT 2021

no code implementations ACL (WAT) 2021 Hwichan Kim, Mamoru Komachi

In this paper, we introduce our TMU Neural Machine Translation (NMT) system submitted for the Patent task (Korean→Japanese and English→Japanese) of the 8th Workshop on Asian Translation (Nakazawa et al., 2021).

Machine Translation · NMT +1

Korean-to-Japanese Neural Machine Translation System using Hanja Information

no code implementations AACL (WAT) 2020 Hwichan Kim, Tosho Hirasawa, Mamoru Komachi

In this paper, we describe our TMU neural machine translation (NMT) system submitted for the Patent task (Korean→Japanese) of the 7th Workshop on Asian Translation (WAT 2020, Nakazawa et al., 2020).

Machine Translation · NMT +1

A Single Linear Layer Yields Task-Adapted Low-Rank Matrices

no code implementations 22 Mar 2024 Hwichan Kim, Shota Sasaki, Sho Hoshino, Ukyo Honda

To confirm this hypothesis, we devise a method named Conditionally Parameterized LoRA (CondLoRA) that updates initial weight matrices with low-rank matrices derived from a single linear layer.
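The snippet above only names the idea; as a rough, hypothetical illustration (not the paper's implementation), the PyTorch sketch below shows how a single shared linear layer could generate per-layer low-rank factors from the frozen initial weights. The class and attribute names (CondLoRALinear, shared_proj) and the exact way the factors are derived are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CondLoRALinear(nn.Module):
    """Hypothetical sketch: the frozen pre-trained weight W0 (out x in) is
    adapted with a low-rank update B @ A, where both factors are produced by
    one linear map `shared_proj` that is shared across all adapted layers and
    conditioned on W0 itself."""

    def __init__(self, weight: torch.Tensor, shared_proj: nn.Linear, rank: int = 8):
        super().__init__()
        self.register_buffer("w0", weight)   # frozen pre-trained weight, shape (out, in)
        self.shared_proj = shared_proj       # the single trainable linear layer, shared
        self.rank = rank

    def lora_factors(self):
        out_dim, in_dim = self.w0.shape
        # Apply the shared map row-wise to W0: (out, in) -> (out, 2 * rank),
        # then split the result into the two low-rank factors.
        mixed = self.shared_proj(self.w0)            # (out, 2 * rank)
        b, c = mixed.split(self.rank, dim=-1)        # each (out, rank)
        a = c.t() @ self.w0 / out_dim                # (rank, in), conditioned on W0
        return a, b

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        a, b = self.lora_factors()
        delta = b @ a                                # (out, in) low-rank update
        return F.linear(x, self.w0 + delta)


# Usage: one shared projection reused for every adapted layer.
in_dim, out_dim, rank = 16, 16, 4
shared_proj = nn.Linear(in_dim, 2 * rank, bias=False)
layer = CondLoRALinear(torch.randn(out_dim, in_dim), shared_proj, rank)
print(layer(torch.randn(2, in_dim)).shape)  # torch.Size([2, 16])
```

In this sketch, only shared_proj carries trainable parameters, so the number of adapter parameters does not grow with the number of adapted layers.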

Learning How to Translate North Korean through South Korean

no code implementations LREC 2022 Hwichan Kim, Sangwhan Moon, Naoaki Okazaki, Mamoru Komachi

Training a model using North Korean data is the most straightforward approach to solving this problem, but there is insufficient data to train NMT models.

Machine Translation · NMT +1

Zero-shot North Korean to English Neural Machine Translation by Character Tokenization and Phoneme Decomposition

no code implementations ACL 2020 Hwichan Kim, Tosho Hirasawa, Mamoru Komachi

The primary limitation of North Korean to English translation is the lack of a parallel corpus; therefore, high translation accuracy cannot be achieved.

Machine Translation · Translation
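The title above names character tokenization and phoneme decomposition; as a hedged illustration of the latter (not code from the paper), the sketch below uses the standard Unicode Hangul syllable arithmetic to split precomposed syllables into their constituent jamo. Function names are hypothetical.

```python
# Sketch of phoneme (jamo) decomposition for Hangul text, assuming the
# standard Unicode Hangul syllable arithmetic; not taken from the paper.

S_BASE, L_BASE, V_BASE, T_BASE = 0xAC00, 0x1100, 0x1161, 0x11A7
V_COUNT, T_COUNT = 21, 28  # 21 vowels, 27 trailing consonants + "none"

def decompose_syllable(ch: str) -> str:
    """Split one precomposed Hangul syllable into its leading consonant,
    vowel, and (optional) trailing consonant jamo."""
    code = ord(ch)
    if not (S_BASE <= code <= 0xD7A3):
        return ch  # not a Hangul syllable: leave it untouched
    index = code - S_BASE
    lead = L_BASE + index // (V_COUNT * T_COUNT)
    vowel = V_BASE + (index % (V_COUNT * T_COUNT)) // T_COUNT
    trail = index % T_COUNT
    jamo = [chr(lead), chr(vowel)]
    if trail:
        jamo.append(chr(T_BASE + trail))
    return "".join(jamo)

def decompose(text: str) -> str:
    return "".join(decompose_syllable(ch) for ch in text)

print(list(decompose("한글")))  # ['ᄒ', 'ᅡ', 'ᆫ', 'ᄀ', 'ᅳ', 'ᆯ']
```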
