no code implementations • COLING 2022 • Peichao Lai, Feiyang Ye, Lin Zhang, Zhiwei Chen, Yanggeng Fu, Yingjie Wu, Yilei Wang
Achieving good performance on few-shot or zero-shot datasets has been a long-standing challenge for NER.
no code implementations • 16 Mar 2024 • Xuehao Wang, Feiyang Ye, Yu Zhang
Furthermore, we introduce a modified SAM (mSAM) for multi-task learning, where we remove SAM's prompt encoder and use a task-specific no-mask embedding and mask decoder for each task.
no code implementations • 17 Jan 2024 • Feiyang Ye, Baijiong Lin, Xiaofeng Cao, Yu Zhang, Ivor Tsang
In this paper, we study the Multi-Objective Bi-Level Optimization (MOBLO) problem, where the upper-level subproblem is a multi-objective optimization problem and the lower-level subproblem is a scalar (single-objective) optimization problem.
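A minimal formulation consistent with this description (the symbols \(\alpha\), \(\omega\), \(F_i\), and \(f\) are assumed for illustration, not taken from the paper):

```latex
% Upper level: m objectives in the upper-level variable \alpha;
% lower level: a single scalar objective f in the variable \omega.
\begin{aligned}
\min_{\alpha}\;& F\bigl(\alpha, \omega^{*}(\alpha)\bigr)
  = \bigl(F_{1}(\alpha, \omega^{*}(\alpha)), \dots,
          F_{m}(\alpha, \omega^{*}(\alpha))\bigr)^{\top} \\
\text{s.t. }\;& \omega^{*}(\alpha) \in \arg\min_{\omega}\, f(\alpha, \omega)
\end{aligned}
```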
no code implementations • 8 Dec 2023 • Jinjing Zhu, Feiyang Ye, Qiao Xiao, Pengxin Guo, Yu Zhang, Qiang Yang
Specifically, the proposed LIWUDA method constructs a weight network to assign weights to each instance based on its probability of belonging to common classes, and designs Weighted Optimal Transport (WOT) for domain alignment by leveraging instance weights.
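The mechanism described above can be sketched as follows. This is a simplified illustration, not the paper's implementation: `instance_weights` is a hypothetical stand-in for the learned weight network, and the weighted OT step uses a plain entropic-regularized (Sinkhorn) solver with the instance weights as the source marginal.

```python
import numpy as np

def instance_weights(probs_common):
    # Hypothetical stand-in for LIWUDA's weight network: weight each
    # source instance by its probability of belonging to a common class,
    # then normalize so the weights form a valid source marginal.
    w = np.asarray(probs_common, dtype=float)
    return w / w.sum()

def weighted_ot(source, target, src_weights, reg=0.1, n_iters=200):
    # Entropic-regularized optimal transport (Sinkhorn iterations) between
    # a weighted source marginal and a uniform target marginal; the cost
    # matrix C is the pairwise squared Euclidean distance.
    a = np.asarray(src_weights, dtype=float)
    b = np.full(len(target), 1.0 / len(target))
    C = ((source[:, None, :] - target[None, :, :]) ** 2).sum(-1)
    K = np.exp(-C / reg)
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)   # scale to match the target marginal b
        u = a / (K @ v)     # scale to match the weighted source marginal a
    plan = u[:, None] * K * v[None, :]
    return (plan * C).sum()  # transport cost used as the alignment loss
```

Down-weighting instances unlikely to belong to common classes shrinks their share of the source marginal, so the transport plan (and hence the alignment loss) is dominated by instances from the shared label space.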
no code implementations • 30 Sep 2023 • Xiang Liu, Liangxi Liu, Feiyang Ye, Yunheng Shen, Xia Li, Linshan Jiang, Jialin Li
Efficiently aggregating trained neural networks from local clients into a global model on a server is a widely researched topic in federated learning.
1 code implementation • 23 Aug 2023 • Baijiong Lin, Weisen Jiang, Feiyang Ye, Yu Zhang, Pengguang Chen, Ying-Cong Chen, Shu Liu, James T. Kwok
Multi-task learning (MTL), a learning paradigm to learn multiple related tasks simultaneously, has achieved great success in various fields.
no code implementations • 16 May 2022 • Liang Huang, Senjie Liang, Feiyang Ye, Nan Gao
In this paper, we propose a Fast Attention Network (FAN) for joint intent detection and slot filling, achieving both high accuracy and low latency.
1 code implementation • 20 Nov 2021 • Baijiong Lin, Feiyang Ye, Yu Zhang, Ivor W. Tsang
Multi-Task Learning (MTL) has achieved success in various fields.
no code implementations • 20 Nov 2021 • Zhixiong Yue, Feiyang Ye, Yu Zhang, Christy Liang, Ivor W. Tsang
We theoretically study the safeness of both learning strategies in the DSMTL model, showing that the proposed methods achieve certain notions of safe multi-task learning.
no code implementations • NeurIPS 2021 • Feiyang Ye, Baijiong Lin, Zhixiong Yue, Pengxin Guo, Qiao Xiao, Yu Zhang
Empirically, we show the effectiveness of the proposed MOML framework in several meta learning problems, including few-shot learning, neural architecture search, domain adaptation, and multi-task learning.