LiFi: Lightweight Controlled Text Generation with Fine-Grained Control Codes

10 Feb 2024 · Chufan Shi, Deng Cai, Yujiu Yang

In the rapidly evolving field of text generation, the demand for more precise control mechanisms has become increasingly apparent. To address this need, we present LiFi, a lightweight methodology with fine-grained control for controlled text generation. Unlike previous studies that train pre-trained language models to follow discrete, categorical, and exclusive control codes, LiFi learns controlled text generation under the guidance of continuous, relative, and non-exclusive control codes. These fine-grained codes are derived automatically from an attribute classifier that is first trained on a small amount of labeled data and then used to label abundant unlabeled data, thereby providing far broader supervision signals. Moreover, to achieve efficient control, we combine the fine-grained control codes with adapters, a parameter- and compute-efficient way to steer a pre-trained language model. We evaluate LiFi on two conventional tasks -- sentiment control and topic control -- and one newly proposed task -- stylistic novel writing. Comprehensive experimental results validate the effectiveness of the proposed method, demonstrating substantial performance improvements over existing baselines.
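A minimal sketch of the two ideas sketched in the abstract: a small attribute classifier pseudo-labels text with a continuous score that serves as a fine-grained control code, and a bottleneck adapter conditions a frozen language model's hidden states on that code. All class names, shapes, and the conditioning scheme here are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn as nn

class AttributeClassifier(nn.Module):
    """Stand-in classifier: maps a sentence embedding to a continuous attribute score."""
    def __init__(self, hidden_size: int):
        super().__init__()
        self.scorer = nn.Linear(hidden_size, 1)

    def forward(self, sentence_emb: torch.Tensor) -> torch.Tensor:
        # Continuous, relative score in [0, 1] used as a fine-grained control code.
        return torch.sigmoid(self.scorer(sentence_emb))

class ControlAdapter(nn.Module):
    """Bottleneck adapter whose residual update is conditioned on the control code."""
    def __init__(self, hidden_size: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_size + 1, bottleneck)  # +1 for the scalar code
        self.up = nn.Linear(bottleneck, hidden_size)

    def forward(self, hidden: torch.Tensor, code: torch.Tensor) -> torch.Tensor:
        # hidden: (batch, seq_len, hidden_size); code: (batch, 1)
        code_expanded = code.unsqueeze(1).expand(-1, hidden.size(1), -1)
        update = self.up(torch.relu(self.down(torch.cat([hidden, code_expanded], dim=-1))))
        return hidden + update  # residual update keeps the frozen LM's representation intact

# Hypothetical usage: pseudo-label unlabeled text, then steer a frozen LM layer.
hidden_size = 16
classifier = AttributeClassifier(hidden_size)
adapter = ControlAdapter(hidden_size)

sentence_emb = torch.randn(2, hidden_size)      # embeddings of unlabeled sentences
control_code = classifier(sentence_emb)         # continuous codes, shape (2, 1)
lm_hidden = torch.randn(2, 10, hidden_size)     # hidden states from a frozen LM layer
steered = adapter(lm_hidden, control_code)      # adapter output fed back into the LM
print(steered.shape)  # torch.Size([2, 10, 16])
```

Because only the adapter and classifier parameters are trained, this kind of setup stays parameter- and compute-efficient relative to fine-tuning the full language model.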
