no code implementations • 23 Jan 2024 • Nasim Soltani, Jifan Zhang, Batool Salehi, Debashri Roy, Robert Nowak, Kaushik Chowdhury
We evaluate the performance of different active learning algorithms on a publicly available multi-modal dataset whose modalities include image and LiDAR.
no code implementations • 12 Jan 2024 • Gantavya Bhatt, Yifang Chen, Arnav M. Das, Jifan Zhang, Sang T. Truong, Stephen Mussmann, Yinglun Zhu, Jeffrey Bilmes, Simon S. Du, Kevin Jamieson, Jordan T. Ash, Robert D. Nowak
To mitigate the annotation cost of SFT and circumvent the computational bottlenecks of active learning, we propose using experimental design.
no code implementations • 14 Dec 2023 • Shyam Nuggehalli, Jifan Zhang, Lalit Jain, Robert Nowak
Our results demonstrate that DIRECT can save more than 60% of the annotation budget compared to state-of-the-art active learning algorithms, and more than 80% compared to random sampling.
1 code implementation • 16 Jun 2023 • Jifan Zhang, Yifang Chen, Gregory Canal, Stephen Mussmann, Arnav M. Das, Gantavya Bhatt, Yinglun Zhu, Jeffrey Bilmes, Simon Shaolei Du, Kevin Jamieson, Robert D. Nowak
Labeled data are critical to modern machine learning applications, but obtaining labels can be expensive.
1 code implementation • NeurIPS 2023 • Jifan Zhang, Shuai Shao, Saurabh Verma, Robert Nowak
To address this, we propose the first adaptive algorithm selection strategy for deep active learning.
no code implementations • 6 Oct 2022 • Liu Yang, Jifan Zhang, Joseph Shenouda, Dimitris Papailiopoulos, Kangwook Lee, Robert D. Nowak
Weight decay is one of the most widely used forms of regularization in deep learning, and has been shown to improve generalization and robustness.
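To make the mechanism concrete, here is a minimal sketch of weight decay as an explicit shrinkage term in an SGD update. This is a generic illustration of the technique, not the analysis or setting of the paper above; the function name, learning rate, and decay coefficient are illustrative choices.

```python
import numpy as np

def sgd_step(w, grad, lr=0.1, weight_decay=0.01):
    """One SGD update with an L2 weight-decay term: in addition to
    following the loss gradient, each weight is pulled toward zero."""
    return w - lr * (grad + weight_decay * w)

w = np.array([1.0, -2.0])
# With a zero loss gradient, the decay term alone shrinks the weights
# geometrically by a factor (1 - lr * weight_decay) per step.
for _ in range(100):
    w = sgd_step(w, grad=np.zeros_like(w))
```

After 100 steps each weight has been multiplied by roughly 0.999^100 ≈ 0.9, which is the regularization pressure that weight decay adds on top of the usual gradient step.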
1 code implementation • 3 Feb 2022 • Jifan Zhang, Julian Katz-Samuels, Robert Nowak
Active learning is a label-efficient approach that trains highly effective models while interactively selecting only small subsets of unlabelled data for labelling and training.
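The interactive selection step can be sketched with the simplest pool-based strategy, uncertainty sampling: query the points on which the current model is least confident. This is a generic illustration of pool-based active learning, not the algorithm proposed in the paper above; the function and variable names are hypothetical.

```python
import numpy as np

def uncertainty_sample(probs, k):
    """Given the model's predicted positive-class probabilities for an
    unlabelled pool, return the indices of the k points closest to the
    0.5 decision boundary, i.e. where the model is least certain."""
    margin = np.abs(probs - 0.5)
    return np.argsort(margin)[:k]

# Toy pool of 5 unlabelled points with current model predictions.
pool_probs = np.array([0.95, 0.52, 0.10, 0.48, 0.70])
query_idx = uncertainty_sample(pool_probs, k=2)  # points near 0.5
```

Only the queried points are sent to an annotator; the model is retrained on the growing labelled set and the loop repeats, which is what keeps the labelling budget small.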
no code implementations • 13 May 2021 • Julian Katz-Samuels, Jifan Zhang, Lalit Jain, Kevin Jamieson
We consider active learning for binary classification in the agnostic pool-based setting.
no code implementations • 29 Oct 2020 • Jifan Zhang, Lalit Jain, Kevin Jamieson
Unlike the design of traditional adaptive algorithms that rely on concentration of measure and careful analysis to justify the correctness and sample complexity of the procedure, our adaptive algorithm is learned via adversarial training over equivalence classes of problems derived from information theoretic lower bounds.