no code implementations • EMNLP (NLP4ConvAI) 2021 • Xinxian Huang, Huang He, Siqi Bao, Fan Wang, Hua Wu, Haifeng Wang
Large-scale conversation models increasingly leverage external knowledge to improve the factual accuracy of response generation.
1 code implementation • 19 Dec 2022 • Mingzhu Cai, Siqi Bao, Xin Tian, Huang He, Fan Wang, Hua Wu
In this paper, we propose an unsupervised query enhanced approach for knowledge-intensive conversations, namely QKConv.
no code implementations • 2 Nov 2022 • Siqi Bao, Huang He, Jun Xu, Hua Lu, Fan Wang, Hua Wu, Han Zhou, Wenquan Wu, Zheng-Yu Niu, Haifeng Wang
Recently, the practical deployment of open-domain dialogue systems has been plagued by knowledge issues: information deficiency and factual inaccuracy.
1 code implementation • 14 Oct 2022 • Xin Tian, Yingzhan Lin, Mengfei Song, Siqi Bao, Fan Wang, Huang He, Shuqi Sun, Hua Wu
Firstly, as the query is in the form of natural language and not confined to the schema of the knowledge base, the issue of domain adaptation is remarkably alleviated in Q-TOD.
1 code implementation • 30 Aug 2022 • Hua Lu, Siqi Bao, Huang He, Fan Wang, Hua Wu, Haifeng Wang
Many open-domain dialogue models pre-trained with social media comments can generate coherent replies but have difficulties producing engaging responses when interacting with real users.
no code implementations • 8 Mar 2022 • Hua Lu, Zhen Guo, Chanjuan Li, Yunyi Yang, Huang He, Siqi Bao
In recent years, Internet memes have been widely used in online chatting.
no code implementations • 23 Dec 2021 • Xin Tian, Xinxian Huang, Dongfeng He, Yingzhan Lin, Siqi Bao, Huang He, Liankai Huang, Qiang Ju, Xiyuan Zhang, Jian Xie, Shuqi Sun, Fan Wang, Hua Wu, Haifeng Wang
Task-oriented dialogue systems have been plagued by the difficulties of obtaining large-scale and high-quality annotated conversations.
1 code implementation • EMNLP (NLP4ConvAI) 2021 • Xin Tian, Liankai Huang, Yingzhan Lin, Siqi Bao, Huang He, Yunyi Yang, Hua Wu, Fan Wang, Shuqi Sun
In this paper, we propose a novel Amendable Generation for Dialogue State Tracking (AG-DST), which contains a two-pass generation process: (1) generating a primitive dialogue state based on the dialogue of the current turn and the previous dialogue state, and (2) amending the primitive dialogue state from the first pass.
Ranked #1 on Dialogue State Tracking on Wizard-of-Oz
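The two-pass process described above can be sketched as follows. This is a minimal, hypothetical illustration of the control flow only: AG-DST uses a pretrained seq2seq generator for both passes, whereas here toy keyword rules stand in for the generator so the example is runnable. All function names and slot values are assumptions, not part of the original.

```python
# Hypothetical sketch of AG-DST's two-pass amendable generation.
# Toy keyword rules stand in for the real seq2seq generator.

def first_pass(previous_state: dict, turn: str) -> dict:
    """Pass 1: generate a primitive dialogue state from the previous
    state and the current turn (toy keyword extraction here)."""
    state = dict(previous_state)
    text = turn.lower()
    if "italian" in text:
        state["restaurant-food"] = "italian"
    if "centre" in text:
        state["restaurant-area"] = "centre"
    return state

def second_pass(primitive_state: dict, turn: str) -> dict:
    """Pass 2: amend the primitive state, fixing slots the first
    pass got wrong (toy rule: an explicit correction overrides)."""
    state = dict(primitive_state)
    text = turn.lower()
    if "actually" in text and "chinese" in text:
        state["restaurant-food"] = "chinese"
    return state

def track(previous_state: dict, turn: str) -> dict:
    """Full two-pass update for one dialogue turn."""
    return second_pass(first_pass(previous_state, turn), turn)

if __name__ == "__main__":
    turn = "I want Italian food in the centre... actually, make that Chinese."
    print(track({}, turn))
    # -> {'restaurant-food': 'chinese', 'restaurant-area': 'centre'}
```

The design point the sketch illustrates is that the second pass conditions on the first pass's full output, so it can repair errors rather than regenerate the state from scratch.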
3 code implementations • 20 Sep 2021 • Siqi Bao, Huang He, Fan Wang, Hua Wu, Haifeng Wang, Wenquan Wu, Zhihua Wu, Zhen Guo, Hua Lu, Xinxian Huang, Xin Tian, Xinchao Xu, Yingzhan Lin, Zheng-Yu Niu
To explore the limit of dialogue generation pre-training, we present the models of PLATO-XL with up to 11 billion parameters, trained on both Chinese and English social media conversations.
1 code implementation • 6 May 2021 • Siqi Bao, Bingjin Chen, Huang He, Xin Tian, Han Zhou, Fan Wang, Hua Wu, Haifeng Wang, Wenquan Wu, Yingzhan Lin
In this work, we explore the application of PLATO-2 on various dialogue systems, including open-domain conversation, knowledge grounded dialogue, and task-oriented conversation.
1 code implementation • 3 Feb 2021 • Huang He, Hua Lu, Siqi Bao, Fan Wang, Hua Wu, Zheng-Yu Niu, Haifeng Wang
Track 1 of DSTC9 aims to effectively answer user requests or questions during task-oriented dialogues that are beyond the scope of the APIs/DB.
3 code implementations • Findings (ACL) 2021 • Siqi Bao, Huang He, Fan Wang, Hua Wu, Haifeng Wang, Wenquan Wu, Zhen Guo, Zhibin Liu, Xinchao Xu
To build a high-quality open-domain chatbot, we introduce the effective training process of PLATO-2 via curriculum learning.
3 code implementations • ACL 2020 • Siqi Bao, Huang He, Fan Wang, Hua Wu, Haifeng Wang
Pre-trained models have proven effective for a wide range of natural language processing tasks.
1 code implementation • ACL 2019 • Siqi Bao, Huang He, Fan Wang, Rongzhong Lian, Hua Wu
In this paper, a novel Generation-Evaluation framework is developed for multi-turn conversations with the objective of letting both participants know more about each other.
1 code implementation • 11 Aug 2018 • Di Jiang, Yuanfeng Song, Rongzhong Lian, Siqi Bao, Jinhua Peng, Huang He, Hua Wu
To relieve the burden on software engineers without knowledge of Bayesian networks, Familia is able to conduct automatic parameter inference for a variety of topic models.
no code implementations • 18 May 2018 • Lei Zheng, Chun-Ta Lu, Lifang He, Sihong Xie, Vahid Noroozi, He Huang, Philip S. Yu
In this paper, we study the problem of modeling users' diverse interests.