no code implementations • 5 May 2024 • Chengpeng Fu, Xiaocheng Feng, Yichong Huang, Wenshuai Huo, Baohang Li, Hui Wang, Bing Qin, Ting Liu
Leveraging large language models for machine translation has demonstrated promising results.
no code implementations • 19 Apr 2024 • Yichong Huang, Xiaocheng Feng, Baohang Li, Yang Xiang, Hui Wang, Bing Qin, Ting Liu
To address this challenge, DEEPEN maps the probability distribution of each model from its own probability space to a universal relative space based on relative representation theory, and performs aggregation there.
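A minimal sketch of this kind of aggregation, assuming a relative space defined by cosine similarity to a shared set of anchor tokens (function names, shapes, and the averaging rule here are illustrative assumptions, not the paper's actual API):

```python
import numpy as np

def to_relative_space(probs, embeddings, anchors):
    """Map one model's next-token distribution into a shared relative space.

    probs:      (V,)  next-token probabilities of one model
    embeddings: (V, d) that model's own token embeddings
    anchors:    (A, d) embeddings of anchor tokens shared across models
    Returns an (A,)-dim relative representation: the expected cosine
    similarity of the predicted token to each anchor (an assumed form).
    """
    emb_n = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    anc_n = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    rel = emb_n @ anc_n.T          # (V, A) cosine similarity to each anchor
    return probs @ rel             # expectation under the distribution

def aggregate(relative_reps, weights=None):
    """Average the models' relative representations (uniform by default)."""
    reps = np.stack(relative_reps)
    if weights is None:
        weights = np.full(len(reps), 1.0 / len(reps))
    return weights @ reps
```

The paper's final step, searching the aggregated representation back into one model's probability space to pick the output token, is omitted here for brevity.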
no code implementations • 10 Jan 2024 • Yichong Huang, Xiaocheng Feng, Baohang Li, Chengpeng Fu, Wenshuai Huo, Ting Liu, Bing Qin
To align the translation-specific understanding with the general one, we propose a novel translation process, xIoD (Cross-Lingual Interpretation of Difficult Words), which explicitly incorporates the general understanding of the content that causes the inconsistency to guide the translation.
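A toy sketch of such a pipeline, under the assumption that it (1) detects words likely to be misunderstood, (2) obtains a general-purpose interpretation for each, and (3) feeds those glosses into the translation prompt; all function names and the vocabulary-based difficulty heuristic are hypothetical:

```python
def detect_difficult_words(sentence, known_vocab):
    """Toy difficulty detector: flag words outside a known vocabulary.
    (The actual detection criterion in xIoD is assumed, not shown here.)"""
    return [w for w in sentence.split() if w.lower() not in known_vocab]

def build_guided_prompt(sentence, glosses, target_lang="German"):
    """Assemble a translation prompt that carries the interpretations."""
    lines = [f'Translate into {target_lang}: "{sentence}"']
    if glosses:
        lines.append("Word interpretations to respect:")
        lines += [f"- {word}: {gloss}" for word, gloss in glosses.items()]
    return "\n".join(lines)
```

In use, the glosses for the detected words would come from a general-purpose model, and the assembled prompt would be sent to the translation model.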
1 code implementation • 25 May 2023 • Yichong Huang, Xiaocheng Feng, Xinwei Geng, Baohang Li, Bing Qin
Multilingual neural machine translation has witnessed remarkable progress in recent years.
1 code implementation • 3 May 2022 • Yichong Huang, Xiaocheng Feng, Xinwei Geng, Bing Qin
In this paper, we propose a novel training strategy named LSSD (Language-Specific Self-Distillation), which can alleviate the convergence inconsistency and help MNMT models achieve the best performance on each language pair simultaneously.
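One plausible form of such a self-distillation objective, sketched here as a per-token loss mixing gold cross-entropy with a KL term toward the distribution of that language pair's own best checkpoint (the mixing weight `alpha` and the exact loss shape are assumptions, not taken from the paper):

```python
import math

def kl_div(p, q, eps=1e-12):
    """KL(p || q) over two discrete distributions, smoothed for stability."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def lssd_loss(student_probs, teacher_probs, gold_index, alpha=0.5):
    """Combine cross-entropy on the gold token with distillation from the
    language-specific teacher checkpoint; alpha is an assumed weight."""
    ce = -math.log(student_probs[gold_index] + 1e-12)
    return (1 - alpha) * ce + alpha * kl_div(teacher_probs, student_probs)
```

The idea is that each language pair distills from its own best checkpoint, so no pair is forced to stop at a shared early-stopping point.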
1 code implementation • 30 Apr 2021 • Yichong Huang, Xiachong Feng, Xiaocheng Feng, Bing Qin
Recently, various neural encoder-decoder models, pioneered by the Seq2Seq framework, have been proposed to generate more abstractive summaries by learning to map input text to output text.