Search Results for author: Xuanting Chen

Found 5 papers, 2 papers with code

Making Parameter-efficient Tuning More Efficient: A Unified Framework for Classification Tasks

1 code implementation, COLING 2022: Xin Zhou, Ruotian Ma, Yicheng Zou, Xuanting Chen, Tao Gui, Qi Zhang, Xuanjing Huang, Rui Xie, Wei Wu

Specifically, we re-formulate both token and sentence classification tasks into a unified language modeling task, and map label spaces of different tasks into the same vocabulary space.

Tasks: Language Modelling, Sentence, +2
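The label-space-to-vocabulary mapping described in this abstract is commonly realized with a "verbalizer" that scores label words at a masked position. Below is a minimal, hypothetical sketch of that general idea (not the paper's actual implementation), using Hugging Face transformers; the model name, prompt template, and label-word mapping are placeholder assumptions.

```python
# Hypothetical sketch: classification reformulated as masked language modeling.
# Model name, template, and verbalizer words are illustrative placeholders,
# not the configuration used in the paper.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForMaskedLM.from_pretrained("roberta-base")
model.eval()

# Map each task label to a word in the shared vocabulary space.
verbalizer = {"positive": "great", "negative": "terrible"}

text = "The movie had wonderful acting."
prompt = f"{text} It was {tokenizer.mask_token}."
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, seq_len, vocab_size)

# Locate the mask position and read out the logit of each label word there.
mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
scores = {}
for label, word in verbalizer.items():
    token_id = tokenizer.convert_tokens_to_ids(tokenizer.tokenize(" " + word)[0])
    scores[label] = logits[0, mask_pos, token_id].item()

print(max(scores, key=scores.get), scores)
```

Because every task is scored through the same vocabulary, token- and sentence-level classification tasks can share one language-modeling head; only the prompt template and verbalizer change per task.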

A Comprehensive Capability Analysis of GPT-3 and GPT-3.5 Series Models

No code implementations, 18 Mar 2023: Junjie Ye, Xuanting Chen, Nuo Xu, Can Zu, Zekai Shao, Shichun Liu, Yuhan Cui, Zeyang Zhou, Chao Gong, Yang Shen, Jie Zhou, Siming Chen, Tao Gui, Qi Zhang, Xuanjing Huang

GPT-series models, such as GPT-3, Codex, InstructGPT, and ChatGPT, have gained considerable attention due to their exceptional natural language processing capabilities.

Tasks: Natural Language Understanding

How Robust is GPT-3.5 to Predecessors? A Comprehensive Study on Language Understanding Tasks

No code implementations, 1 Mar 2023: Xuanting Chen, Junjie Ye, Can Zu, Nuo Xu, Rui Zheng, Minlong Peng, Jie Zhou, Tao Gui, Qi Zhang, Xuanjing Huang

The GPT-3.5 models have demonstrated impressive performance in various Natural Language Processing (NLP) tasks, showcasing their strong understanding and reasoning capabilities.

Tasks: Natural Language Inference, Natural Language Understanding, +1

Learning "O" Helps for Learning More: Handling the Concealed Entity Problem for Class-incremental NER

No code implementations, 10 Oct 2022: Ruotian Ma, Xuanting Chen, Lin Zhang, Xin Zhou, Junzhe Wang, Tao Gui, Qi Zhang, Xiang Gao, Yunwen Chen

In this work, we conduct an empirical study on the "Unlabeled Entity Problem" and find that it leads to severe confusion between "O" and entities, reducing the class discrimination of old classes and diminishing the model's ability to learn new classes.

Tasks: Class Incremental Learning, Contrastive Learning, +3
