Search Results for author: Tianhua Tao

Found 6 papers, 4 papers with code

Don’t Take It Literally: An Edit-Invariant Sequence Loss for Text Generation

1 code implementation · NAACL 2022 · Guangyi Liu, Zichao Yang, Tianhua Tao, Xiaodan Liang, Junwei Bao, Zhen Li, Xiaodong He, Shuguang Cui, Zhiting Hu

Such a training objective is sub-optimal when the target sequence is not perfect, e.g., when it is corrupted with noise or when only weak sequence supervision is available.

Machine Translation · Style Transfer · +2
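The idea behind this snippet is a loss that does not penalize the model for token-level misalignments with an imperfect target. Below is a minimal, hypothetical sketch of that idea in PyTorch: a position-invariant unigram loss that credits a target token wherever it appears in the output, rather than only at its exact aligned position. This is an illustration only, not the paper's EISL implementation, which matches n-grams rather than single tokens.

```python
# Minimal sketch (not the paper's EISL implementation): a position-invariant
# unigram loss. Each target token is credited with its best log-probability
# over all output positions, so a token shifted by an insertion or deletion
# in a noisy target is not automatically penalized.
import torch
import torch.nn.functional as F

def position_invariant_unigram_loss(logits: torch.Tensor,
                                    target_ids: torch.Tensor) -> torch.Tensor:
    """logits: (T_out, V) unnormalized scores; target_ids: (T_tgt,) token ids."""
    log_probs = F.log_softmax(logits, dim=-1)   # (T_out, V)
    per_target = log_probs[:, target_ids]       # (T_out, T_tgt)
    best = per_target.max(dim=0).values         # best position for each target token
    return -best.mean()

logits = torch.randn(7, 100)         # 7 decoding steps, vocabulary of 100
target = torch.tensor([3, 14, 15])   # a short, possibly noisy, target
print(position_invariant_unigram_loss(logits, target))
```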

SlimPajama-DC: Understanding Data Combinations for LLM Training

no code implementations · 19 Sep 2023 · Zhiqiang Shen, Tianhua Tao, Liqun Ma, Willie Neiswanger, Zhengzhong Liu, Hongyi Wang, Bowen Tan, Joel Hestness, Natalia Vassilieva, Daria Soboleva, Eric Xing

This paper aims to understand the impact of various data combinations (e.g., web text, Wikipedia, GitHub, books) on the training of large language models using SlimPajama.
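As a rough illustration of what a "data combination" means in practice, here is a hedged sketch of drawing training examples from several sources according to mixture weights. The source names and weights below are illustrative placeholders, not the configurations studied in the paper.

```python
# Hedged sketch: sampling training examples from multiple sources according
# to mixture weights. The weights are placeholders, not the actual
# SlimPajama-DC configurations.
import random

mixture = {
    "web_text":  0.60,
    "wikipedia": 0.15,
    "github":    0.15,
    "books":     0.10,
}

names, weights = zip(*mixture.items())
batch_sources = random.choices(names, weights=weights, k=8)
print(batch_sources)   # e.g. ['web_text', 'books', 'web_text', ...]
```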

Language Models Meet World Models: Embodied Experiences Enhance Language Models

1 code implementation · NeurIPS 2023 · Jiannan Xiang, Tianhua Tao, Yi Gu, Tianmin Shu, Zirui Wang, Zichao Yang, Zhiting Hu

While large language models (LMs) have shown remarkable capabilities across numerous tasks, they often struggle with simple reasoning and planning in physical environments, such as understanding object permanence or planning household activities.

On the Learning of Non-Autoregressive Transformers

no code implementations · 13 Jun 2022 · Fei Huang, Tianhua Tao, Hao Zhou, Lei Li, Minlie Huang

The Non-autoregressive Transformer (NAT) is a family of text generation models that aims to reduce decoding latency by predicting entire sentences in parallel.

Text Generation
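To make the latency contrast concrete, here is a hedged toy sketch: a non-autoregressive decoder fills every position from a single forward pass, while an autoregressive decoder needs one pass per token. The random scores stand in for a real model's outputs; this is not the paper's architecture.

```python
# Toy sketch of the latency difference between parallel (NAT) and sequential
# (autoregressive) decoding. Random scores stand in for a real model.
import torch

length, vocab = 6, 100

# Non-autoregressive: one parallel forward pass scores all positions at once,
# then every token is read off simultaneously.
parallel_scores = torch.randn(length, vocab)
nat_tokens = parallel_scores.argmax(dim=-1)   # one decoding step in total

# Autoregressive: one forward pass per position. A real AR decoder would also
# condition each step on the tokens generated so far; the loop here only
# illustrates that latency grows with sentence length.
ar_tokens = []
for t in range(length):
    step_scores = torch.randn(vocab)          # stand-in for step t's forward pass
    ar_tokens.append(step_scores.argmax().item())

print(nat_tokens.tolist(), ar_tokens)
```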

