no code implementations • 17 Jan 2024 • Feiyang Ye, Baijiong Lin, Xiaofeng Cao, Yu Zhang, Ivor Tsang
In this paper, we study the Multi-Objective Bi-Level Optimization (MOBLO) problem, where the upper-level subproblem is a multi-objective optimization problem and the lower-level subproblem is a scalar optimization problem.
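In standard bilevel notation (a sketch; the symbols $\alpha$, $\omega$, $F_i$, and $f$ are our own, not necessarily those of the paper), the MOBLO setting described above can be written as a vector-valued upper level constrained by a scalar lower level:

```latex
\min_{\alpha} \; F\big(\alpha, \omega^{*}(\alpha)\big)
  = \Big( F_1\big(\alpha, \omega^{*}(\alpha)\big), \ldots, F_m\big(\alpha, \omega^{*}(\alpha)\big) \Big)
\quad \text{s.t.} \quad
\omega^{*}(\alpha) = \operatorname*{arg\,min}_{\omega} \; f(\alpha, \omega),
```

where the $m$ upper-level objectives $F_1, \ldots, F_m$ are optimized jointly over $\alpha$, while $\omega^{*}(\alpha)$ is obtained by minimizing the single lower-level objective $f$.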
no code implementations • 3 Oct 2023 • Weisen Jiang, Baijiong Lin, Han Shi, Yu Zhang, Zhenguo Li, James T. Kwok
Recently, various merging methods have been proposed to build a multi-task model from task-specific finetuned models without retraining.
1 code implementation • 23 Aug 2023 • Baijiong Lin, Weisen Jiang, Feiyang Ye, Yu Zhang, Pengguang Chen, Ying-Cong Chen, Shu Liu, James T. Kwok
Multi-task learning (MTL), a learning paradigm to learn multiple related tasks simultaneously, has achieved great success in various fields.
no code implementations • 23 Aug 2023 • Xiyu Wang, Baijiong Lin, Daochang Liu, Chang Xu
Diffusion Probabilistic Models (DPMs) have demonstrated substantial promise in image generation tasks but heavily rely on the availability of large amounts of training data.
1 code implementation • 27 Mar 2022 • Baijiong Lin, Yu Zhang
This paper presents LibMTL, an open-source Python library built on PyTorch, which provides a unified, comprehensive, reproducible, and extensible implementation framework for Multi-Task Learning (MTL).
1 code implementation • 20 Nov 2021 • Baijiong Lin, Feiyang Ye, Yu Zhang, Ivor W. Tsang
Multi-Task Learning (MTL) has achieved success in various fields.
no code implementations • NeurIPS 2021 • Feiyang Ye, Baijiong Lin, Zhixiong Yue, Pengxin Guo, Qiao Xiao, Yu Zhang
Empirically, we show the effectiveness of the proposed MOML framework in several meta learning problems, including few-shot learning, neural architecture search, domain adaptation, and multi-task learning.
no code implementations • 19 Nov 2020 • Pengxin Guo, Yuancheng Xu, Baijiong Lin, Yu Zhang
More specifically, MTA uses an adversarial-perturbation generator that consists of a shared encoder for all tasks and multiple task-specific decoders.
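The shared-encoder / task-specific-decoder layout described above can be sketched with tiny NumPy "layers" standing in for real networks (a minimal illustration, not the paper's implementation; the class name, dimensions, and the `eps` perturbation bound are our own assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def linear(in_dim, out_dim):
    # A tiny linear layer stored as (weights, bias); stand-in for a real network.
    return rng.standard_normal((in_dim, out_dim)) * 0.1, np.zeros(out_dim)

def apply_layer(layer, x):
    w, b = layer
    return np.tanh(x @ w + b)  # tanh keeps outputs in [-1, 1]

class PerturbationGenerator:
    """Sketch of a generator with one shared encoder and per-task decoders."""
    def __init__(self, in_dim, hidden_dim, tasks, eps=0.03):
        self.encoder = linear(in_dim, hidden_dim)  # shared across all tasks
        self.decoders = {t: linear(hidden_dim, in_dim) for t in tasks}  # task-specific
        self.eps = eps

    def perturb(self, x, task):
        z = apply_layer(self.encoder, x)             # shared representation
        delta = apply_layer(self.decoders[task], z)  # task-specific perturbation
        return x + self.eps * delta                  # bounded additive perturbation

gen = PerturbationGenerator(in_dim=4, hidden_dim=8, tasks=["seg", "depth"])
x = rng.standard_normal((2, 4))
x_adv = gen.perturb(x, "seg")
print(x_adv.shape)  # (2, 4)
```

Because the decoder output passes through `tanh`, each perturbation is bounded elementwise by `eps`; sharing the encoder lets all tasks reuse one representation while each decoder specializes its perturbation.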
no code implementations • 19 Nov 2020 • Zhixiong Yue, Baijiong Lin, Xiaonan Huang, Yu Zhang
Although NAS methods can find network architectures with state-of-the-art performance, adversarial robustness and resource constraints are often ignored in NAS.