Search Results for author: Zhengbo Wang

Found 6 papers, 4 papers with code

A Hard-to-Beat Baseline for Training-free CLIP-based Adaptation

1 code implementation · 6 Feb 2024 · Zhengbo Wang, Jian Liang, Lijun Sheng, Ran He, Zilei Wang, Tieniu Tan

Extensive results on 17 datasets validate that our method surpasses or achieves comparable results with state-of-the-art methods on few-shot classification, imbalanced learning, and out-of-distribution generalization.

Out-of-Distribution Generalization

Connecting the Dots: Collaborative Fine-tuning for Black-Box Vision-Language Models

no code implementations · 6 Feb 2024 · Zhengbo Wang, Jian Liang, Ran He, Zilei Wang, Tieniu Tan

This paper proposes a Collaborative Fine-Tuning (CraFT) approach for fine-tuning black-box VLMs on downstream tasks, where one only has access to the input prompts and the output predictions of the model.

Self-training solutions for the ICCV 2023 GeoNet Challenge

1 code implementation · 28 Nov 2023 · Lijun Sheng, Zhengbo Wang, Jian Liang

Our solution adopts a two-stage source-free domain adaptation framework with a Swin Transformer backbone to achieve knowledge transfer from the USA (source) domain to the Asia (target) domain.

Source-Free Domain Adaptation · Transfer Learning

Towards Realistic Unsupervised Fine-tuning with CLIP

no code implementations · 24 Aug 2023 · Jian Liang, Lijun Sheng, Zhengbo Wang, Ran He, Tieniu Tan

The emergence of vision-language models (VLMs), such as CLIP, has spurred a significant research effort towards their application for downstream supervised learning tasks.

Out-of-Distribution Detection

Exploiting Semantic Attributes for Transductive Zero-Shot Learning

1 code implementation · 17 Mar 2023 · Zhengbo Wang, Jian Liang, Zilei Wang, Tieniu Tan

To address this issue, we present a novel transductive ZSL method that produces semantic attributes of the unseen data and imposes them on the generative process.

Attribute · Generative Adversarial Network