Search Results for author: Didi Zhu

Found 9 papers, 0 papers with code

Improving Group Connectivity for Generalization of Federated Deep Learning

no code implementations · 29 Feb 2024 · Zexi Li, Jie Lin, Zhiqi Li, Didi Zhu, Chao Wu

Bridging the gap between linear mode connectivity (LMC) and federated learning (FL), this paper leverages fixed anchor models to study, both empirically and theoretically, how connectivity transfers from a pair of models (LMC) to a group of models (model fusion in FL).

Federated Learning · Linear Mode Connectivity
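
Linear mode connectivity asks whether the loss stays low along the straight line between two trained solutions. The sketch below is not the paper's method, just a minimal pure-Python illustration of the underlying barrier check; the double-well toy loss and weight vectors are assumptions for demonstration only.

```python
def lerp(w1, w2, alpha):
    """Linearly interpolate two weight vectors: (1 - alpha) * w1 + alpha * w2."""
    return [(1 - alpha) * a + alpha * b for a, b in zip(w1, w2)]

def loss_barrier(loss, w1, w2, steps=11):
    """Max loss along the linear path minus the mean endpoint loss.
    A barrier near zero indicates the two solutions are linearly mode-connected."""
    path = [loss(lerp(w1, w2, k / (steps - 1))) for k in range(steps)]
    return max(path) - 0.5 * (loss(w1) + loss(w2))

# Toy double-well loss: minima at w = -1 and w = +1 with a barrier at w = 0.
double_well = lambda w: (w[0] ** 2 - 1) ** 2

print(loss_barrier(double_well, [-1.0], [1.0]))  # 1.0 -> not linearly connected
print(loss_barrier(double_well, [1.0], [1.0]))   # 0.0 -> trivially connected
```

In practice the same barrier check is run with real network weights flattened into vectors and the training loss evaluated at each interpolation point.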

Model Tailor: Mitigating Catastrophic Forgetting in Multi-modal Large Language Models

no code implementations · 19 Feb 2024 · Didi Zhu, Zhongyi Sun, Zexi Li, Tao Shen, Ke Yan, Shouhong Ding, Kun Kuang, Chao Wu

Catastrophic forgetting emerges as a critical challenge when fine-tuning multi-modal large language models (MLLMs), where improving performance on unseen tasks often leads to a significant performance drop on the original tasks.

Image Captioning · Question Answering +1

RESMatch: Referring Expression Segmentation in a Semi-Supervised Manner

no code implementations · 8 Feb 2024 · Ying Zang, Chenglong Fu, Runlong Cao, Didi Zhu, Min Zhang, WenJun Hu, Lanyun Zhu, Tianrun Chen

This pioneering work lays the groundwork for future research in semi-supervised learning for referring expression segmentation.

Image Segmentation · Pseudo Label +5
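
A common building block of semi-supervised pipelines like the one tagged above is confidence-thresholded pseudo-labeling: predictions on unlabeled data are kept as training targets only when the model is sufficiently confident. This is a generic sketch of that idea, not RESMatch itself; the threshold and softmax outputs are illustrative assumptions.

```python
def pseudo_labels(probs, threshold=0.9):
    """Return (sample index, argmax class) for each prediction whose top
    softmax probability clears the confidence threshold."""
    selected = []
    for i, p in enumerate(probs):
        conf = max(p)
        if conf >= threshold:
            selected.append((i, p.index(conf)))
    return selected

# Hand-written softmax outputs over 3 classes for 3 unlabeled samples.
unlabeled_preds = [
    [0.95, 0.03, 0.02],  # confident -> pseudo-label class 0
    [0.40, 0.35, 0.25],  # uncertain -> discarded
    [0.05, 0.92, 0.03],  # confident -> pseudo-label class 1
]
print(pseudo_labels(unlabeled_preds))  # [(0, 0), (2, 1)]
```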

Understanding Prompt Tuning for V-L Models Through the Lens of Neural Collapse

no code implementations · 28 Jun 2023 · Didi Zhu, Zexi Li, Min Zhang, Junkun Yuan, Yunfeng Shao, Jiashuo Liu, Kun Kuang, Yinchuan Li, Chao Wu

We find that the neural collapse (NC) optimality of text-to-image representations correlates positively with downstream generalizability, and that this effect is more pronounced under class-imbalance settings.

Quantitatively Measuring and Contrastively Exploring Heterogeneity for Domain Generalization

no code implementations · 25 May 2023 · Yunze Tong, Junkun Yuan, Min Zhang, Didi Zhu, Keli Zhang, Fei Wu, Kun Kuang

With contrastive learning, we propose a learning-potential-guided metric for domain heterogeneity that promotes the learning of variant features.

Contrastive Learning · Domain Generalization
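
Contrastive objectives of the kind tagged above typically score an anchor against one positive and several negatives with an InfoNCE-style loss. The following is a generic pure-Python sketch of that loss, not the metric proposed in the paper; the embeddings and temperature are illustrative assumptions.

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def info_nce(anchor, positive, negatives, tau=0.1):
    """InfoNCE loss: -log( exp(sim(a,p)/tau) / sum_k exp(sim(a,k)/tau) ),
    where k ranges over the positive and all negatives."""
    logits = [cosine(anchor, positive) / tau] + [
        cosine(anchor, n) / tau for n in negatives
    ]
    m = max(logits)  # shift for numerical stability
    denom = sum(math.exp(l - m) for l in logits)
    return m + math.log(denom) - logits[0]

anchor = [1.0, 0.0]
loss = info_nce(anchor, [0.9, 0.1], [[-1.0, 0.0], [0.0, 1.0]])
print(loss)  # small: the positive is well aligned with the anchor
```

Swapping the positive for a dissimilar vector drives the loss up, which is the pressure that pulls positive pairs together and pushes negatives apart.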

Generalized Universal Domain Adaptation with Generative Flow Networks

no code implementations · 8 May 2023 · Didi Zhu, Yinchuan Li, Yunfeng Shao, Jianye Hao, Fei Wu, Kun Kuang, Jun Xiao, Chao Wu

We introduce a new problem in unsupervised domain adaptation, termed as Generalized Universal Domain Adaptation (GUDA), which aims to achieve precise prediction of all target labels including unknown categories.

Universal Domain Adaptation · Unsupervised Domain Adaptation

Universal Domain Adaptation via Compressive Attention Matching

no code implementations · ICCV 2023 · Didi Zhu, Yinchuan Li, Junkun Yuan, Zexi Li, Kun Kuang, Chao Wu

To address this issue, we propose a Universal Attention Matching (UniAM) framework that exploits the self-attention mechanism in vision transformers to capture crucial object information.

Universal Domain Adaptation
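
The self-attention weights UniAM builds on come from standard scaled dot-product attention inside a vision transformer. Below is a minimal single-head sketch of that mechanism in pure Python, purely to illustrate what "attention" refers to here; the token vectors and dimensions are toy assumptions, not anything from the paper.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def self_attention(Q, K, V):
    """Single-head Attention(Q, K, V) = softmax(Q K^T / sqrt(d)) V.
    Q, K, V are lists of token vectors (rows)."""
    d = len(Q[0])
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
        weights = softmax(scores)  # how much each token attends to the others
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# With a single key, the query attends to it fully and returns its value.
print(self_attention([[1.0, 0.0]], [[1.0, 0.0]], [[2.0, 3.0]]))  # [[2.0, 3.0]]
```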

Towards Effective Clustered Federated Learning: A Peer-to-peer Framework with Adaptive Neighbor Matching

no code implementations · 23 Mar 2022 · Zexi Li, Jiaxun Lu, Shuang Luo, Didi Zhu, Yunfeng Shao, Yinchuan Li, Zhimeng Zhang, Yongheng Wang, Chao Wu

In the literature, centralized clustered FL algorithms require the number of clusters to be assumed in advance and hence cannot effectively explore the latent relationships among clients.

Federated Learning

Ensemble Federated Adversarial Training with Non-IID data

no code implementations · 26 Oct 2021 · Shuang Luo, Didi Zhu, Zexi Li, Chao Wu

Although federated learning endows distributed clients with a cooperative training mode while protecting data privacy and security, the clients remain vulnerable to adversarial samples due to their lack of robustness.

Federated Learning
