no code implementations • Findings (ACL) 2022 • Yuxuan Gu, Xiaocheng Feng, Sicheng Ma, Jiaming Wu, Heng Gong, Bing Qin
Weighted decoding methods, composed of a pretrained language model (LM) and a controller, have achieved promising results for controllable text generation.
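As a rough illustration of the weighted-decoding idea (not the paper's specific method), the next-token distribution can be formed as a log-linear mixture of the base LM's logits and an attribute controller's logits; the weight and toy vocabulary below are hypothetical.

```python
import numpy as np

def softmax(logits):
    z = logits - logits.max()
    e = np.exp(z)
    return e / e.sum()

def weighted_decode_step(lm_logits, ctrl_logits, weight=2.0):
    """One decoding step: combine base-LM logits with controller logits.

    Sketch of generic weighted decoding: the next-token distribution is
    proportional to p_LM(x) * p_ctrl(x)^weight, i.e. a log-linear mixture.
    """
    return softmax(lm_logits + weight * ctrl_logits)

# Toy 4-token vocabulary; the controller up-weights token 2.
lm_logits = np.log(np.array([0.4, 0.3, 0.2, 0.1]))
ctrl_logits = np.array([0.0, 0.0, 2.0, 0.0])
probs = weighted_decode_step(lm_logits, ctrl_logits)
```

With the controller's boost, the combined distribution shifts probability mass toward the attribute-consistent token even though the base LM preferred another one.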
no code implementations • 20 Feb 2023 • Weihong Zhong, Mao Zheng, Duyu Tang, Xuan Luo, Heng Gong, Xiaocheng Feng, Bing Qin
Although large-scale video-language pre-training models, which usually build a global alignment between the video and the text, have achieved remarkable progress on various downstream tasks, the idea of adopting fine-grained information during the pre-training stage is not well explored.
1 code implementation • 16 Dec 2022 • Yuxuan Gu, Xiaocheng Feng, Sicheng Ma, Lingyuan Zhang, Heng Gong, Weihong Zhong, Bing Qin
Previous work on controllable text generation has explored the idea of control from the latent space, such as optimizing a representation with attribute-related classifiers or sampling a representation from relevant discrete samples.
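The "optimizing a representation with attribute-related classifiers" idea can be sketched as gradient ascent on a classifier's score with respect to a latent vector; the linear classifier, step size, and dimensions below are all hypothetical, not the paper's actual setup.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def optimize_latent(z, w, b=0.0, steps=100, lr=0.5):
    """Nudge latent vector z toward the target attribute by gradient
    ascent on a hypothetical linear attribute classifier sigmoid(w.z + b)."""
    for _ in range(steps):
        p = sigmoid(w @ z + b)
        # gradient of log p with respect to z is (1 - p) * w
        z = z + lr * (1.0 - p) * w
    return z

rng = np.random.default_rng(0)
w = rng.normal(size=8)      # classifier weights (stand-in for a trained model)
z0 = rng.normal(size=8)     # initial latent representation
z1 = optimize_latent(z0.copy(), w)
```

After optimization, the classifier assigns the latent a higher attribute probability; in the full pipeline such a latent would then condition the decoder.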
1 code implementation • 6 Oct 2022 • Yuxuan Gu, Xiaocheng Feng, Sicheng Ma, Lingyuan Zhang, Heng Gong, Bing Qin
Multi-aspect controllable text generation is a more challenging and practical task than single-aspect control.
1 code implementation • COLING 2020 • Heng Gong, Yawei Sun, Xiaocheng Feng, Bing Qin, Wei Bi, Xiaojiang Liu, Ting Liu
Although neural table-to-text models have achieved remarkable progress with the help of large-scale datasets, they suffer from insufficient learning when training data is limited.
1 code implementation • Findings of the Association for Computational Linguistics 2020 • Heng Gong, Wei Bi, Xiaocheng Feng, Bing Qin, Xiaojiang Liu, Ting Liu
Neural table-to-text models, which select and order salient data and verbalize it fluently via surface realization, have achieved promising progress.
1 code implementation • 24 Feb 2020 • Xiaocheng Feng, Yawei Sun, Bing Qin, Heng Gong, Yibo Sun, Wei Bi, Xiaojiang Liu, Ting Liu
In this paper, we focus on a new practical task, document-scale text content manipulation, which is the opposite of text style transfer and aims to preserve text styles while altering the content.
1 code implementation • IJCNLP 2019 • Heng Gong, Xiaocheng Feng, Bing Qin, Ting Liu
To address the aforementioned problems, we not only model each table cell in the context of other records in the same row, but also enrich the table's representation by modeling each cell in the context of other cells in the same column or of historical (time-dimension) data.
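The row/column/time contextualization described above can be loosely sketched as augmenting each cell embedding with summaries of its row and column; the mean-pooling and concatenation below are a simplified stand-in for the paper's learned encoders.

```python
import numpy as np

def enrich_cells(table):
    """Hypothetical sketch of row/column-aware cell representations.

    `table` is an (R, C, D) array of cell embeddings. Each cell's enriched
    vector concatenates its own embedding with the mean of its row and the
    mean of its column, mimicking row and column context.
    """
    row_ctx = table.mean(axis=1, keepdims=True)   # (R, 1, D) row summary
    col_ctx = table.mean(axis=0, keepdims=True)   # (1, C, D) column summary
    row_ctx = np.broadcast_to(row_ctx, table.shape)
    col_ctx = np.broadcast_to(col_ctx, table.shape)
    return np.concatenate([table, row_ctx, col_ctx], axis=-1)  # (R, C, 3D)

cells = np.arange(3 * 4 * 5, dtype=float).reshape(3, 4, 5)
enriched = enrich_cells(cells)
```

Historical (time-dimension) context would add a third summary over past tables in the same way; it is omitted here to keep the sketch minimal.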
no code implementations • E2E NLG Challenge System Descriptions 2017 • Heng Gong
This paper describes the primary system submitted by the author to the E2E NLG Challenge on the E2E Dataset (Novikova et al., 2017).
Ranked #10 on Data-to-Text Generation on E2E NLG Challenge