no code implementations • 20 Dec 2021 • Yong Cao, Yukun Feng, Shaohui Kuang, Gu Xu
In almost all text generation applications, word sequences are constructed in a left-to-right (L2R) or right-to-left (R2L) manner, as natural language sentences are written either L2R or R2L.
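A minimal sketch of what L2R versus R2L generation means in practice: the decoder emits one token at a time, and an R2L system is typically trained on reversed target sentences, so its output is un-reversed at the end. The `model.next_token` interface here is hypothetical, not any particular system's API.

```python
# Hypothetical greedy decoder illustrating generation direction.
def greedy_decode(model, src, max_len=50, direction="l2r"):
    """Generate a target sequence token by token.

    For R2L, the model is assumed to have been trained on reversed
    target sentences, so we simply reverse the hypothesis at the end.
    """
    tokens = ["<bos>"]
    for _ in range(max_len):
        nxt = model.next_token(src, tokens)  # argmax over the vocabulary
        if nxt == "<eos>":
            break
        tokens.append(nxt)
    hyp = tokens[1:]
    return hyp if direction == "l2r" else hyp[::-1]
```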
no code implementations • 1 Jan 2021 • Shaohui Kuang, Heng Yu, Weihua Luo, Qiang Wang
Existing approaches either employ an extra encoder to encode information from the TM or concatenate the source sentence and TM sentences as the encoder's input.
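A minimal sketch of the second strategy mentioned above, concatenation: the retrieved TM sentence is appended to the source with a separator token and the combined sequence is fed to a standard NMT encoder. The `<sep>` token and the example sentences are illustrative, not the paper's exact setup.

```python
# Sketch of the "concatenation" strategy for TM-augmented NMT.
def build_encoder_input(src_tokens, tm_tokens, sep="<sep>"):
    """Concatenate the source sentence and a retrieved TM sentence
    into one encoder input sequence."""
    return src_tokens + [sep] + tm_tokens

src = "he lives in Beijing".split()
tm = "she lives in Shanghai".split()  # fuzzy-matched TM sentence
print(build_encoder_input(src, tm))
# ['he', 'lives', 'in', 'Beijing', '<sep>', 'she', 'lives', 'in', 'Shanghai']
```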
no code implementations • 5 Apr 2020 • Shanbo Cheng, Shaohui Kuang, Rongxiang Weng, Heng Yu, Changfeng Zhu, Weihua Luo
Compared with using only limited authentic parallel data as the training corpus, many studies have shown that incorporating synthetic parallel data, generated by back translation (BT) or forward translation (FT, or self-training), into the NMT training process can significantly improve translation quality.
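A hedged sketch of the BT step: a reverse-direction model turns target-side monolingual text into synthetic sources, which are paired with the real targets and mixed into the authentic training data (FT is the mirror image, translating source-side monolingual text forward). `reverse_model.translate` stands in for any trained target-to-source system.

```python
# Back translation: build synthetic (source, target) pairs
# from target-language monolingual data.
def back_translate(reverse_model, mono_tgt_sentences):
    """Create synthetic parallel pairs; the target side stays authentic."""
    synthetic = []
    for tgt in mono_tgt_sentences:
        src = reverse_model.translate(tgt)  # synthetic source sentence
        synthetic.append((src, tgt))
    return synthetic

# training_corpus = authentic_pairs + back_translate(rev_model, mono_tgt)
```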
no code implementations • 2 Dec 2019 • Tao Wang, Shaohui Kuang, Deyi Xiong, António Branco
As neural machine translation (NMT) is not easily amenable to explicit correction of errors, incorporating pre-specified translations into NMT is widely regarded as a non-trivial challenge.
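One simple way to force a pre-specified translation, shown below purely as illustration (not necessarily this paper's method), is placeholder substitution: the constrained source phrase is masked with a special tag, translated, and the required target phrase is substituted back in. The `model.translate` call and the tag name are assumptions.

```python
# Placeholder substitution for pre-specified translations (illustrative).
def translate_with_constraint(model, src, src_phrase, tgt_phrase, tag="<c1>"):
    masked = src.replace(src_phrase, tag)  # protect the constrained span
    hyp = model.translate(masked)          # the tag is copied through as a token
    return hyp.replace(tag, tgt_phrase)    # restore the required translation
```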
no code implementations • 25 Nov 2019 • Qian Cao, Shaohui Kuang, Deyi Xiong
In this paper, we study the problem of enabling neural machine translation (NMT) to reuse previous translations from similar examples in target prediction.
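A minimal sketch of the retrieval step such an approach needs: find the stored example whose source side best matches the input, e.g. by a fuzzy-match score. The use of `difflib` similarity here is an assumption for illustration, not the paper's metric.

```python
# Retrieve the most similar previous translation by fuzzy matching.
import difflib

def retrieve_similar(src, memory):
    """Return the (source, target) pair whose source best matches `src`.

    `memory` is a list of (source, target) sentence pairs.
    """
    def score(pair):
        return difflib.SequenceMatcher(None, src, pair[0]).ratio()
    return max(memory, key=score)
```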
no code implementations • COLING 2018 • Shaohui Kuang, Deyi Xiong
Neural machine translation (NMT) systems are usually trained on a large amount of bilingual sentence pairs and translate one sentence at a time, ignoring inter-sentence information.
1 code implementation • 3 May 2018 • Lifeng Han, Shaohui Kuang
We integrate Chinese radicals into the NMT model under different settings to address the unseen-word challenge in Chinese-to-English translation.
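A toy sketch of radical-level input, assuming a character-to-radical lookup table (a real decomposition would come from a radical resource; `RADICALS` below holds two hand-picked examples only). Decomposing characters into shared radicals gives rare or unseen characters subword-like units.

```python
# Toy character-to-radical decomposition for radical-level NMT input.
RADICALS = {"好": ["女", "子"], "明": ["日", "月"]}  # toy stand-in table

def to_radical_sequence(sentence):
    """Replace each character with its radicals when known,
    falling back to the raw character otherwise."""
    out = []
    for ch in sentence:
        out.extend(RADICALS.get(ch, [ch]))
    return out

print(to_radical_sequence("明天好"))  # ['日', '月', '天', '女', '子']
```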
no code implementations • COLING 2018 • Shaohui Kuang, Deyi Xiong, Weihua Luo, Guodong Zhou
Sentences in a well-formed text are connected to each other via various links to form the cohesive structure of the text.
no code implementations • ACL 2018 • Shaohui Kuang, Junhui Li, António Branco, Weihua Luo, Deyi Xiong
In neural machine translation, a source sequence of words is encoded into a vector from which a target sequence is generated in the decoding phase.
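A minimal numpy sketch of the decode-time attention that relaxes this single-vector bottleneck: at each step the decoder weights all encoder states by relevance and reads out a context vector. Shapes and the dot-product scoring function are illustrative.

```python
# Dot-product attention over encoder states (illustrative shapes).
import numpy as np

def attend(dec_state, enc_states):
    """Weight encoder states by relevance to the current decoder state."""
    scores = enc_states @ dec_state      # (src_len,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()             # softmax over source positions
    return weights @ enc_states          # context vector, shape (dim,)

enc_states = np.random.randn(5, 8)  # 5 source positions, hidden dim 8
dec_state = np.random.randn(8)
print(attend(dec_state, enc_states).shape)  # (8,)
```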