1 code implementation • 26 Feb 2024 • James T. Wilson
Bayesian optimization is a popular framework for efficiently finding high-quality solutions to difficult problems based on limited prior information.
1 code implementation • 2 Sep 2023 • Lucas Cosier, Rares Iordan, Sicelukwanda Zwane, Giovanni Franzese, James T. Wilson, Marc Peter Deisenroth, Alexander Terenin, Yasemin Bekiroglu
To control how a robot moves, motion planning algorithms must compute paths in high-dimensional state spaces while accounting for physical constraints related to motors and joints, generating smooth and stable motions, avoiding obstacles, and preventing collisions.
2 code implementations • 8 Nov 2020 • James T. Wilson, Viacheslav Borovitskiy, Alexander Terenin, Peter Mostowsky, Marc Peter Deisenroth
As Gaussian processes are used to answer increasingly complex questions, analytic solutions become scarcer and scarcer.
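When closed-form answers are unavailable, a common fallback is to draw Monte Carlo samples from the Gaussian process posterior and answer the question empirically. The following is a rough, self-contained sketch of location-scale posterior sampling via Cholesky factorization (not the specific method of the listed paper); the RBF kernel and all helper names are illustrative:

```python
import numpy as np

def rbf(a, b, lengthscale=1.0):
    """Squared-exponential kernel matrix k(a_i, b_j)."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def sample_posterior(x_train, y_train, x_test, n_samples=3, noise=1e-4):
    """Draw joint samples of the latent function at x_test from a GP posterior."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_train, x_test)
    Kss = rbf(x_test, x_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks.T @ alpha                      # posterior mean at x_test
    v = np.linalg.solve(L, Ks)
    cov = Kss - v.T @ v + 1e-6 * np.eye(len(x_test))  # jitter for stability
    Lp = np.linalg.cholesky(cov)
    z = np.random.randn(len(x_test), n_samples)
    return mean[:, None] + Lp @ z            # shape (n_test, n_samples)
```

Each column is one function sample; statistics of quantities without analytic expressions (e.g. the distribution of the sample's minimizer) can then be estimated by averaging over columns.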
5 code implementations • ICML 2020 • James T. Wilson, Viacheslav Borovitskiy, Alexander Terenin, Peter Mostowsky, Marc Peter Deisenroth
Gaussian processes are the gold standard for many real-world modeling problems, especially in cases where a model's success hinges upon its ability to faithfully represent predictive uncertainty.
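As a minimal sketch of what "predictive uncertainty" means here: exact GP regression yields a closed-form posterior mean and variance at every test input, with variance shrinking near observed data and reverting to the prior far from it. Kernel choice and helper names below are illustrative, not taken from the listed paper:

```python
import numpy as np

def rbf(a, b, lengthscale=1.0):
    """Squared-exponential kernel matrix k(a_i, b_j)."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_predict(x_train, y_train, x_test, noise=1e-3):
    """Exact GP regression: posterior mean and per-point latent variance."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_train, x_test)
    mean = Ks.T @ np.linalg.solve(K, y_train)
    # prior variance is 1 for this kernel; subtract the explained part
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks), axis=0)
    return mean, var
```

The variance term is what downstream decisions (e.g. Bayesian optimization) consume: it quantifies how much the model admits it does not know at each input.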
1 code implementation • NeurIPS 2018 • James T. Wilson, Frank Hutter, Marc Peter Deisenroth
Bayesian optimization is a sample-efficient approach to global optimization that relies on theoretically motivated value heuristics (acquisition functions) to guide its search process.
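As a rough illustration of one widely used acquisition function (expected improvement, written here for minimization; not necessarily the heuristic studied in the listed paper), given a Gaussian predictive mean and standard deviation at candidate points:

```python
import numpy as np
from math import erf

def expected_improvement(mean, std, best):
    """EI for minimization: E[max(best - f, 0)] under N(mean, std^2)."""
    std = np.maximum(std, 1e-12)          # guard against zero variance
    z = (best - mean) / std
    Phi = 0.5 * (1.0 + np.vectorize(erf)(z / np.sqrt(2.0)))  # normal CDF
    phi = np.exp(-0.5 * z ** 2) / np.sqrt(2.0 * np.pi)       # normal PDF
    return (best - mean) * Phi + std * phi
```

The next query point is the maximizer of this quantity over the search space; the trade-off between low predicted mean (exploitation) and high predictive standard deviation (exploration) is visible directly in the two summands.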
1 code implementation • 1 Dec 2017 • James T. Wilson, Riccardo Moriconi, Frank Hutter, Marc Peter Deisenroth
Bayesian optimization is a sample-efficient approach to solving global optimization problems.
no code implementations • 14 Jun 2015 • Wenlin Chen, James T. Wilson, Stephen Tyree, Kilian Q. Weinberger, Yixin Chen
Convolutional neural networks (CNNs) are increasingly used in many areas of computer vision.
1 code implementation • 19 Apr 2015 • Wenlin Chen, James T. Wilson, Stephen Tyree, Kilian Q. Weinberger, Yixin Chen
As deep nets are increasingly used in applications suited for mobile devices, a fundamental dilemma becomes apparent: the trend in deep learning is to grow models to absorb ever-increasing data set sizes; however, mobile devices are designed with very little memory and cannot store such large models.