no code implementations • Findings (NAACL) 2022 • Jiarun Wu, Qingliang Chen, Zeguan Xiao, Yuliang Gu, Mengsi Sun
Pre-trained language models have shown great success in multiple downstream tasks.
no code implementations • 13 Mar 2024 • Zeguan Xiao, Yan Yang, Guanhua Chen, Yun Chen
In this work, we propose Tastle, a novel black-box jailbreak framework for automated red teaming of LLMs.
no code implementations • EMNLP 2021 • Zeguan Xiao, Jiarun Wu, Qingliang Chen, Congjian Deng
Graph-based Aspect-based Sentiment Classification (ABSC) approaches have yielded state-of-the-art results, especially when equipped with contextual word embeddings from pre-trained language models (PLMs).