Search Results for author: Andrew Delworth

Found 1 paper, 1 paper with code

Enhancing CLIP with CLIP: Exploring Pseudolabeling for Limited-Label Prompt Tuning

2 code implementations • NeurIPS 2023 • Cristina Menghini, Andrew Delworth, Stephen H. Bach

We find that (1) unexplored prompt tuning strategies that iteratively refine pseudolabels consistently improve CLIP accuracy, by 19.5 points in semi-supervised learning, by 28.4 points in transductive zero-shot learning, and by 15.2 points in unsupervised learning, and (2) unlike conventional semi-supervised pseudolabeling, which exacerbates model biases toward classes with higher-quality pseudolabels, prompt tuning leads to a more equitable distribution of per-class accuracy.
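As a rough illustration of the kind of pseudolabeling the abstract refers to, the sketch below assigns CLIP zero-shot pseudolabels to unlabeled images and keeps only the most confident ones per class. This is not the authors' code: the `clip` package usage follows OpenAI's public API, while `class_names`, `unlabeled_images`, `tune_prompts`, and the growth schedule are hypothetical placeholders for illustration.

```python
# Minimal sketch (not the paper's implementation) of confidence-based
# pseudolabeling with CLIP, assuming OpenAI's `clip` package and PyTorch.
import torch
import clip

device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = clip.load("ViT-B/32", device=device)

class_names = ["cat", "dog", "bird"]  # placeholder class names (assumption)
prompt_tokens = clip.tokenize([f"a photo of a {c}" for c in class_names]).to(device)

@torch.no_grad()
def pseudolabel(images, keep_per_class):
    """Label preprocessed images with CLIP and keep the most confident per class."""
    img = model.encode_image(images)
    txt = model.encode_text(prompt_tokens)
    img = img / img.norm(dim=-1, keepdim=True)
    txt = txt / txt.norm(dim=-1, keepdim=True)
    probs = (100.0 * img @ txt.T).softmax(dim=-1)
    conf, labels = probs.max(dim=-1)
    kept = []
    for c in range(len(class_names)):
        idx = (labels == c).nonzero(as_tuple=True)[0]
        if len(idx):
            top = idx[conf[idx].topk(min(keep_per_class, len(idx))).indices]
            kept.append(top)
    kept = torch.cat(kept)
    return kept, labels[kept]

# Iterative refinement: alternate prompt tuning with re-labeling, growing the
# pseudolabel pool each round (schedule and helper are illustrative only).
# for k in (16, 32, 64):
#     idx, y = pseudolabel(unlabeled_images, keep_per_class=k)
#     tune_prompts(unlabeled_images[idx], y)  # hypothetical prompt-tuning step
```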

Image Classification, Zero-Shot Learning