Bayesian Optimisation
86 papers with code • 0 benchmarks • 0 datasets
Expensive black-box functions arise in many disciplines, including tuning the parameters of machine learning algorithms, robotics, and other engineering design problems. Bayesian Optimisation is a principled, sample-efficient technique for the global optimisation of such functions. The idea is to place a prior distribution over the target function and then update that prior with a small set of "true" observations, obtained by expensively evaluating the function, to produce a posterior predictive distribution. The posterior then informs where to make the next observation through an acquisition function, which balances the exploitation of regions known to perform well against the exploration of regions where there is little information about the function's response.
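The loop described above can be sketched in a few dozen lines: a Gaussian-process surrogate, an expected-improvement acquisition function, and a toy objective. The objective, kernel length-scale, grid search, and all names here are illustrative assumptions, not taken from any particular library.

```python
import numpy as np
from math import erf

def objective(x):
    # Stand-in for an expensive black-box function (cheap here, for clarity).
    return np.sin(3.0 * x) + 0.5 * x

def rbf_kernel(a, b, length=0.5):
    # Squared-exponential covariance between two sets of 1-D points.
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)

def gp_posterior(x_obs, y_obs, x_query, noise=1e-6):
    # Standard GP regression: posterior mean and variance at the query points.
    K = rbf_kernel(x_obs, x_obs) + noise * np.eye(len(x_obs))
    K_s = rbf_kernel(x_obs, x_query)
    mean = K_s.T @ np.linalg.solve(K, y_obs)
    var = 1.0 - np.sum(K_s * np.linalg.solve(K, K_s), axis=0)
    return mean, np.maximum(var, 1e-12)

_norm_cdf = np.vectorize(lambda z: 0.5 * (1.0 + erf(z / np.sqrt(2.0))))

def expected_improvement(mean, var, best_y):
    # Trades off exploitation (high mean) against exploration (high variance).
    std = np.sqrt(var)
    z = (mean - best_y) / std
    pdf = np.exp(-0.5 * z ** 2) / np.sqrt(2.0 * np.pi)
    return (mean - best_y) * _norm_cdf(z) + std * pdf

grid = np.linspace(0.0, 2.0, 200)   # discretised search space
x_obs = np.array([0.2, 1.8])        # initial "true" observations
y_obs = objective(x_obs)

for _ in range(5):                  # a handful of BO iterations
    mean, var = gp_posterior(x_obs, y_obs, grid)
    ei = expected_improvement(mean, var, y_obs.max())
    x_next = grid[np.argmax(ei)]    # where the acquisition says to look next
    x_obs = np.append(x_obs, x_next)
    y_obs = np.append(y_obs, objective(x_next))

print(f"best x = {x_obs[np.argmax(y_obs)]:.2f}, best y = {y_obs.max():.2f}")
```

Note that each iteration spends one expensive evaluation where the acquisition function is largest, rather than sampling the space blindly; real libraries optimise the acquisition continuously instead of on a fixed grid.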
Source: A Bayesian Approach for the Robust Optimisation of Expensive-to-Evaluate Functions
Libraries
Use these libraries to find Bayesian Optimisation models and implementations
Most implemented papers
Efficient Bayesian Experimental Design for Implicit Models
Bayesian experimental design involves the optimal allocation of resources in an experiment, with the aim of optimising cost and performance.
Algorithmic Assurance: An Active Approach to Algorithmic Testing using Bayesian Optimisation
We address this problem by proposing an efficient framework for algorithmic testing.
Batch Selection for Parallelisation of Bayesian Quadrature
Integration over non-negative integrands is a central problem in machine learning (e.g. for model averaging, (hyper-)parameter marginalisation, and computing posterior predictive distributions).
Fitting A Mixture Distribution to Data: Tutorial
In explaining the main algorithm, fitting a mixture of two distributions is detailed first, with examples of fitting two Gaussians (the continuous case) and two Poissons (the discrete case).
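A minimal EM sketch for the two-Gaussian case mentioned above; the synthetic data, initial values, and iteration count are illustrative assumptions, not taken from the tutorial itself.

```python
import numpy as np

rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(-2.0, 1.0, 300),   # component 1
                       rng.normal(3.0, 1.0, 300)])   # component 2

weights = np.array([0.5, 0.5])   # mixing proportions
mu = np.array([-1.0, 1.0])       # component means (deliberately rough start)
sigma = np.array([1.0, 1.0])     # component standard deviations

for _ in range(100):
    # E-step: responsibility of each component for each data point.
    dens = np.exp(-0.5 * ((data[:, None] - mu) / sigma) ** 2) \
           / (sigma * np.sqrt(2.0 * np.pi))
    resp = weights * dens
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters from the responsibilities.
    n_k = resp.sum(axis=0)
    weights = n_k / len(data)
    mu = (resp * data[:, None]).sum(axis=0) / n_k
    sigma = np.sqrt((resp * (data[:, None] - mu) ** 2).sum(axis=0) / n_k)

print(np.sort(mu))   # recovered means, close to the true -2 and 3
```

The same E-step/M-step structure carries over to the Poisson (discrete) case by swapping the Gaussian density for the Poisson mass function.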
Asynchronous Batch Bayesian Optimisation with Improved Local Penalisation
Batch Bayesian optimisation (BO) has been successfully applied to hyperparameter tuning using parallel computing, but it is wasteful of resources: workers that complete jobs ahead of others are left idle.
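A hedged illustration of the local-penalisation idea (not the paper's exact scheme): downweight the acquisition function near points whose asynchronous evaluations are still running, so an idle worker is sent elsewhere. All names here are hypothetical.

```python
import numpy as np

def penalised_acquisition(acq, x_grid, busy_points, radius=0.3):
    # Shrink acquisition values near points that other workers are currently
    # evaluating, discouraging redundant selections within a batch.
    penalty = np.ones_like(acq)
    for b in busy_points:
        penalty *= 1.0 - np.exp(-0.5 * ((x_grid - b) / radius) ** 2)
    return acq * penalty

x_grid = np.linspace(0.0, 1.0, 101)
acq = np.ones_like(x_grid)   # flat acquisition, purely for illustration
pen = penalised_acquisition(acq, x_grid, busy_points=[0.5])
# pen is exactly 0 at the busy point and grows with distance from it
print(pen[50], pen[0])
```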
Gaussian Process Priors for Dynamic Paired Comparison Modelling
Dynamic paired comparison models, such as Elo and Glicko, are frequently used for sports prediction and ranking players or teams.
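For context, the basic (static) Elo update that these dynamic models generalise fits in a few lines; the K-factor of 32 is a common convention, not something prescribed by the paper.

```python
def elo_update(rating_a, rating_b, score_a, k=32.0):
    # Expected score of player A under the logistic Elo model.
    expected_a = 1.0 / (1.0 + 10.0 ** ((rating_b - rating_a) / 400.0))
    # Ratings move in proportion to how surprising the observed result was.
    new_a = rating_a + k * (score_a - expected_a)
    new_b = rating_b + k * ((1.0 - score_a) - (1.0 - expected_a))
    return new_a, new_b

# Two equally rated players; A wins (score_a = 1.0), so A gains 16 points.
print(elo_update(1500.0, 1500.0, 1.0))  # → (1516.0, 1484.0)
```

Glicko and the Gaussian-process approach extend this by also tracking the uncertainty in each rating over time.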
Tuning Hyperparameters without Grad Students: Scalable and Robust Bayesian Optimisation with Dragonfly
We compare Dragonfly to a suite of other packages and algorithms for global optimisation and demonstrate that when the above methods are integrated, they enable significant improvements in the performance of BO.
Effective Estimation of Deep Generative Language Models
We concentrate on one such model, the variational auto-encoder, which we argue is an important building block in hierarchical probabilistic models of language.
Parallel Gaussian process surrogate Bayesian inference with noisy likelihood evaluations
We consider Bayesian inference when only a limited number of noisy log-likelihood evaluations can be obtained.
Fast and Reliable Architecture Selection for Convolutional Neural Networks
The performance of a Convolutional Neural Network (CNN) depends on its hyperparameters, such as the number of layers, the kernel sizes, or the learning rate.