no code implementations • COLING 2022 • Haoxiang Shi, Rongsheng Zhang, Jiaan Wang, Cen Wang, Yinhe Zheng, Tetsuya Sakai
Pre-trained Language Models (PLMs) are the cornerstone of modern Natural Language Processing (NLP).
no code implementations • 13 Mar 2024 • ZiQi Liang, Haoxiang Shi, Jiawei Wang, Keda Lu
Recurrent neural networks have become a standard technique for modeling sequential data in TTS systems and are widely used.
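As a rough illustration of that sequential-modeling role, here is a minimal sketch of an LSTM acoustic model in PyTorch; the layer sizes and the 80-bin frame output are illustrative assumptions, not the paper's configuration:

```python
import torch
import torch.nn as nn

class RNNAcousticModel(nn.Module):
    """Toy LSTM mapping per-step linguistic features to acoustic frames (e.g. mel bins)."""
    def __init__(self, in_dim=64, hidden_dim=256, out_dim=80):  # illustrative sizes
        super().__init__()
        self.rnn = nn.LSTM(in_dim, hidden_dim, batch_first=True)
        self.proj = nn.Linear(hidden_dim, out_dim)

    def forward(self, features):            # features: (batch, time, in_dim)
        hidden, _ = self.rnn(features)      # (batch, time, hidden_dim)
        return self.proj(hidden)            # (batch, time, out_dim)

frames = RNNAcousticModel()(torch.randn(2, 100, 64))
print(frames.shape)   # torch.Size([2, 100, 80])
```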
no code implementations • 5 Aug 2023 • Haoxiang Shi, Sumio Fujita, Tetsuya Sakai
In addition, consistency filtering often struggles to identify retrieval intentions and recognize query and corpus distributions in a target domain.
1 code implementation • 7 Mar 2023 • Jiaan Wang, Yunlong Liang, Fandong Meng, Zengkui Sun, Haoxiang Shi, Zhixu Li, Jinan Xu, Jianfeng Qu, Jie zhou
In detail, we regard ChatGPT as a human evaluator and give task-specific (e.g., summarization) and aspect-specific (e.g., relevance) instructions to prompt ChatGPT to evaluate the generated results of NLG models.
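A minimal sketch of this evaluation setup, assuming hypothetical helper names (build_prompt, query_chatgpt) rather than the authors' released code:

```python
def build_prompt(task, aspect, source, output):
    """Compose a task-specific and aspect-specific instruction for the LLM evaluator."""
    return (
        f"You will be given one {task} output.\n"
        f"Rate its {aspect} on a scale of 1 to 5 and briefly justify the score.\n\n"
        f"Source text:\n{source}\n\n"
        f"Generated {task}:\n{output}\n\n"
        f"Score:"
    )

def query_chatgpt(prompt):
    """Placeholder for a chat-completion API call; fill in with your client of choice."""
    raise NotImplementedError

prompt = build_prompt(
    task="summarization",
    aspect="relevance",
    source="The home team won 3-1 after a late penalty ...",
    output="The home side secured a 3-1 victory.",
)
# score_text = query_chatgpt(prompt)
```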
1 code implementation • 18 Jul 2022 • Jiaan Wang, Tingyi Zhang, Haoxiang Shi
Sports game summarization aims to generate sports news based on real-time commentaries.
no code implementations • Asian Chapter of the Association for Computational Linguistics 2020 • Haoxiang Shi, Cen Wang, Tetsuya Sakai
This paper presents a deep neural architecture that applies a Siamese convolutional neural network with shared model parameters to learn a semantic similarity metric between two sentences.
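A minimal PyTorch sketch of such a shared-parameter (Siamese) CNN encoder, assuming illustrative vocabulary and filter sizes rather than the paper's exact configuration:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SiameseCNN(nn.Module):
    """One CNN encoder reused for both sentences; similarity = cosine of the two vectors."""
    def __init__(self, vocab_size=10000, emb_dim=128, num_filters=100, kernel_size=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.conv = nn.Conv1d(emb_dim, num_filters, kernel_size, padding=1)

    def encode(self, token_ids):                   # token_ids: (batch, seq_len)
        x = self.embed(token_ids).transpose(1, 2)  # (batch, emb_dim, seq_len)
        x = F.relu(self.conv(x))                   # (batch, num_filters, seq_len)
        return x.max(dim=2).values                 # max-over-time pooling

    def forward(self, sent_a, sent_b):
        # The same weights encode both sentences, so the learned metric is symmetric.
        return F.cosine_similarity(self.encode(sent_a), self.encode(sent_b), dim=1)

a = torch.randint(0, 10000, (4, 20))
b = torch.randint(0, 10000, (4, 20))
print(SiameseCNN()(a, b))   # one similarity score in [-1, 1] per sentence pair
```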
no code implementations • 17 Nov 2020 • Haoxiang Shi, Cen Wang
Contrastive learning is a promising approach to unsupervised learning, as it inherits the advantages of well-studied deep models without requiring a dedicated, complex model design.
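To make the idea concrete, here is a common contrastive objective (InfoNCE over two augmented views of a batch) as a hedged sketch; it is not necessarily the exact loss used in this paper:

```python
import torch
import torch.nn.functional as F

def info_nce_loss(z1, z2, temperature=0.5):
    """Each sample's positive is its other view; all other samples in the batch are negatives."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature       # (batch, batch) cosine-similarity matrix
    targets = torch.arange(z1.size(0))       # positives sit on the diagonal
    return F.cross_entropy(logits, targets)

# Embeddings of two augmented views of the same batch from a shared encoder.
z1, z2 = torch.randn(8, 64), torch.randn(8, 64)
print(info_nce_loss(z1, z2))
```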