no code implementations • 21 Feb 2024 • Yun-Shiuan Chuang, Jerry Zhu, Timothy T. Rogers
Whereas cognitive models of learning often assume direct experience with both the features of an event and with a true label or outcome, much of everyday learning arises from hearing the opinions of others, without direct access to either the experience or the ground truth outcome.
no code implementations • 11 Jun 2021 • Xuezhou Zhang, Yiding Chen, Jerry Zhu, Wen Sun
Surprisingly, in this case, the knowledge of $\epsilon$ is necessary, as we show that being adaptive to unknown $\epsilon$ is impossible. This again contrasts with recent results on corruption-robust online RL and implies that robust offline RL is a strictly harder problem.
no code implementations • 11 Jun 2021 • Zaynah Javed, Daniel S. Brown, Satvik Sharma, Jerry Zhu, Ashwin Balakrishna, Marek Petrik, Anca D. Dragan, Ken Goldberg
Results suggest that PG-BROIL can produce a family of behaviors ranging from risk-neutral to risk-averse, and that it outperforms state-of-the-art imitation learning algorithms when learning from ambiguous demonstrations by hedging against uncertainty rather than seeking to uniquely identify the demonstrator's reward function.
no code implementations • NeurIPS 2016 • Tzu-Kuo Huang, Lihong Li, Ara Vartanian, Saleema Amershi, Jerry Zhu
We present a theoretical analysis of active learning under more realistic interactions with human oracles.
no code implementations • NeurIPS 2015 • Kwang-Sung Jun, Jerry Zhu, Timothy T. Rogers, Zhuoran Yang, Ming Yuan
In this paper, we propose the first efficient maximum likelihood estimate (MLE) for INVITE by decomposing the censored output into a series of absorbing random walks.
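The core of the decomposition can be illustrated with standard absorbing-Markov-chain algebra (a sketch of the idea, not the paper's full estimator): if the walk restarts from an already-emitted item and moves through visited items unobserved, the probability that each unvisited item is the next *new* observation is an absorption probability, obtained from the fundamental matrix $(I - Q)^{-1}R$. The function name and toy chain below are illustrative.

```python
import numpy as np

def next_item_probs(P, visited, start):
    """Probability that each unvisited state is the next NEW item
    observed, when the walk starts at `start` and passes through
    already-visited states without being observed (censoring).
    Visited states are transient, unvisited states are absorbing:
    B = (I - Q)^{-1} R gives the absorption probabilities."""
    n = P.shape[0]
    unvisited = [i for i in range(n) if i not in visited]
    Q = P[np.ix_(visited, visited)]    # visited -> visited moves
    R = P[np.ix_(visited, unvisited)]  # visited -> unvisited moves
    B = np.linalg.solve(np.eye(len(visited)) - Q, R)
    return dict(zip(unvisited, B[visited.index(start)]))

# toy 3-state chain with uniform off-diagonal transitions
P = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])
print(next_item_probs(P, visited=[0], start=0))  # items 1 and 2 equally likely
```

Multiplying such terms over the censored output sequence yields a tractable likelihood, which is the ingredient the MLE needs.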
no code implementations • NeurIPS 2014 • Kaustubh R. Patil, Jerry Zhu, Łukasz Kopeć, Bradley C. Love
We apply a machine teaching procedure to a cognitive model that is either limited capacity (as humans are) or unlimited capacity (as most machine learning systems are).
no code implementations • NeurIPS 2013 • Jerry Zhu
What if there is a teacher who knows the learning goal and wants to design good training data for a machine learner?
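The flavor of the question can be seen in the simplest case, a 1D threshold classifier: a teacher who already knows the target threshold needs only two well-chosen examples straddling the boundary, far fewer than a learner sampling at random would need. A minimal sketch (the function names are illustrative, not from the paper):

```python
def teach_threshold(theta, eps=0.05):
    """Hypothetical teacher for a 1D threshold classifier: two
    examples straddling the target threshold pin it down to
    within eps, independent of the data distribution."""
    return [(theta - eps / 2, 0), (theta + eps / 2, 1)]

def learn_threshold(examples):
    """Consistent learner: place the threshold midway between the
    largest negative example and the smallest positive example."""
    neg = max(x for x, y in examples if y == 0)
    pos = min(x for x, y in examples if y == 1)
    return (neg + pos) / 2

print(learn_threshold(teach_threshold(0.7)))  # close to 0.7
```

Two teaching examples suffice here, whereas passive learning needs on the order of 1/eps random samples to localize the boundary as tightly.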
no code implementations • NeurIPS 2011 • Shilin Ding, Grace Wahba, Jerry Zhu
In discrete undirected graphical models, the conditional independence of node labels Y is specified by the graph structure.
no code implementations • NeurIPS 2011 • Faisal Khan, Bilge Mutlu, Jerry Zhu
We study the empirical strategies that humans follow as they teach a target concept with a simple 1D threshold to a robot.
no code implementations • NeurIPS 2010 • Tim Rogers, Chuck Kalish, Joseph Harrison, Jerry Zhu, Bryan R. Gibson
When the distribution of unlabeled data in feature space lies along a manifold, the information it provides may be used by a learner to assist classification in a semi-supervised setting.
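One standard way a learner can exploit manifold structure is graph-based label propagation, in the spirit of Zhu's earlier harmonic-function work; the sketch below illustrates that mechanism, not the human-learning model studied in this paper. Labels diffuse from labeled points to unlabeled ones along a similarity graph built from the data.

```python
import numpy as np

def label_propagation(X, y, labeled, sigma=1.0):
    """Harmonic-function label propagation: build a Gaussian-weighted
    similarity graph over all points, then solve for the unlabeled
    scores so they are harmonic (each equals the weighted average of
    its neighbors), holding labeled values fixed."""
    n = len(X)
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-D2 / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    L = np.diag(W.sum(1)) - W                # graph Laplacian
    u = [i for i in range(n) if i not in labeled]
    # harmonic solution: L_uu f_u = -L_ul y_l
    f = y.astype(float)
    f[u] = np.linalg.solve(L[np.ix_(u, u)], -L[np.ix_(u, labeled)] @ y[labeled])
    return f

X = np.array([[0.0], [0.1], [5.0], [5.1]])   # two clusters on a line
y = np.array([0.0, 0.0, 1.0, 1.0])
print(label_propagation(X, y, labeled=[0, 3]))  # interior points follow their cluster
```

With one labeled point per cluster, the unlabeled points inherit the label of the cluster they lie in, which is exactly the manifold assumption at work.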
no code implementations • NeurIPS 2010 • Andrew Goldberg, Ben Recht, Jun-Ming Xu, Robert Nowak, Jerry Zhu
We pose transductive classification as a matrix completion problem.
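The idea can be sketched as follows (a Soft-Impute-style low-rank completion, in the spirit of the paper's formulation but not its exact algorithm): stack the label row and the feature rows into one matrix, mark the unknown labels as missing, and complete the matrix under a low-rank prior by iterative SVD soft-thresholding. The filled-in entries in the label row are the transductive predictions.

```python
import numpy as np

def soft_impute(M, mask, lam=0.1, iters=200):
    """Low-rank matrix completion via iterative SVD soft-thresholding.
    mask[i, j] = True where M[i, j] is observed; missing entries are
    filled with the current low-rank estimate at each iteration."""
    Z = np.where(mask, M, 0.0)
    for _ in range(iters):
        U, s, Vt = np.linalg.svd(np.where(mask, M, Z), full_matrices=False)
        Z = (U * np.maximum(s - lam, 0.0)) @ Vt   # shrink singular values
    return Z

# transduction: first row holds labels; one label is unobserved
X = np.array([[1.0, 1.0, -1.0, -1.0],
              [0.9, 1.1, -1.0, -0.8],
              [1.2, 0.8, -0.9, -1.1]])
mask = np.ones_like(X, dtype=bool)
mask[0, 3] = False                # hide the last item's label
Z = soft_impute(X, mask)
print(np.sign(Z[0, 3]))           # predicted label for the hidden entry
```

Because the label row is nearly a linear function of the feature rows, the low-rank fit recovers the hidden label's sign from the observed entries alone.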
no code implementations • NeurIPS 2009 • Jerry Zhu, Bryan R. Gibson, Timothy T. Rogers
We propose to use Rademacher complexity, originally developed in computational learning theory, as a measure of human learning capacity.
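Empirical Rademacher complexity is estimated by asking how well a learner can fit purely random ±1 labels: average, over random sign vectors σ, the achieved correlation (1/n) Σᵢ σᵢ h(xᵢ). A Monte-Carlo sketch, with a generic `fit_predict` callback standing in for the learner under study (a human participant in the paper; any classifier here):

```python
import numpy as np

def empirical_rademacher(X, fit_predict, trials=50, seed=0):
    """Monte-Carlo estimate of empirical Rademacher complexity:
    draw random sign labels, let the learner fit them, and record
    the mean agreement (1/n) sum_i sigma_i * h(x_i)."""
    rng = np.random.default_rng(seed)
    n = len(X)
    vals = []
    for _ in range(trials):
        sigma = rng.choice([-1.0, 1.0], size=n)
        h = fit_predict(X, sigma)      # learner's fit to the random labels
        vals.append(np.mean(sigma * h))
    return float(np.mean(vals))

X = np.random.default_rng(1).normal(size=(20, 2))
memorizer = lambda X, s: s                   # fits anything: capacity 1.0
constant = lambda X, s: np.ones(len(X))      # fits nothing: capacity ~0.0
print(empirical_rademacher(X, memorizer))
print(empirical_rademacher(X, constant))
```

A learner that can memorize arbitrary labels scores 1.0; a learner with no capacity scores near 0. The same probe applied to human category learners gives the capacity measure the paper proposes.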
no code implementations • NeurIPS 2008 • Rui M. Castro, Charles Kalish, Robert Nowak, Ruichen Qian, Tim Rogers, Jerry Zhu
We investigate a topic at the interface of machine learning and cognitive science.
no code implementations • NeurIPS 2008 • Aarti Singh, Robert Nowak, Jerry Zhu
We show that there are large classes of problems for which SSL can significantly outperform supervised learning, in finite sample regimes and sometimes also in terms of error convergence rates.