Bayesian Optimisation

86 papers with code • 0 benchmarks • 0 datasets

Expensive black-box functions arise in many disciplines, including hyperparameter tuning for machine learning algorithms, robotics, and other engineering design problems. Bayesian Optimisation is a principled and efficient technique for the global optimisation of such functions. The idea is to place a prior distribution over the target function and then update that prior with a set of "true" observations obtained by expensively evaluating the function, producing a posterior predictive distribution. The posterior then informs where to evaluate the target function next via an acquisition function, which balances exploitation of regions known to perform well with exploration of regions where there is little information about the function's response.
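The loop described above can be sketched in a few lines. The toy objective, the RBF-kernel Gaussian-process surrogate, and expected improvement as the acquisition function are all illustrative assumptions here, not a specific library's implementation:

```python
import math
import numpy as np

def rbf_kernel(a, b, length=0.3):
    # Squared-exponential kernel between two 1-D point sets
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_train, y_train, x_query, noise=1e-6):
    # Standard GP regression: posterior mean and std dev at the query points
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_query)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(rbf_kernel(x_query, x_query)) - np.sum(v * v, axis=0)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best):
    # EI for minimisation: trades off low predicted mean vs high uncertainty
    z = (best - mu) / sigma
    cdf = 0.5 * (1 + np.vectorize(math.erf)(z / math.sqrt(2)))
    pdf = np.exp(-0.5 * z ** 2) / math.sqrt(2 * math.pi)
    return (best - mu) * cdf + sigma * pdf

def target(x):
    # Stand-in for an expensive black-box function (assumed for illustration)
    return np.sin(3 * x) + 0.5 * x

rng = np.random.default_rng(0)
x_train = rng.uniform(0, 2, 3)          # a few initial "true" observations
y_train = target(x_train)
candidates = np.linspace(0, 2, 200)

for _ in range(10):
    mu, sigma = gp_posterior(x_train, y_train, candidates)
    ei = expected_improvement(mu, sigma, y_train.min())
    x_next = candidates[np.argmax(ei)]  # acquisition picks the next evaluation
    x_train = np.append(x_train, x_next)
    y_train = np.append(y_train, target(x_next))

print(x_train[np.argmin(y_train)])      # best point found so far
```

After a handful of iterations the evaluations concentrate around the objective's minimum, even though each GP fit only ever sees a few expensive observations.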

Source: A Bayesian Approach for the Robust Optimisation of Expensive-to-Evaluate Functions

Libraries

Use these libraries to find Bayesian Optimisation models and implementations

Most implemented papers

Efficient Bayesian Experimental Design for Implicit Models

stevenkleinegesse/bedimplicit 23 Oct 2018

Bayesian experimental design involves the optimal allocation of resources in an experiment, with the aim of optimising cost and performance.

Batch Selection for Parallelisation of Bayesian Quadrature

OxfordML/bayesquad 4 Dec 2018

Integration over non-negative integrands is a central problem in machine learning (e.g. for model averaging, (hyper-)parameter marginalisation, and computing posterior predictive distributions).

Fitting A Mixture Distribution to Data: Tutorial

bghojogh/Fitting-Mixture-Distribution 20 Jan 2019

In explaining the main algorithm, fitting a mixture of two distributions is detailed first, with examples of fitting two Gaussian and two Poisson distributions for the continuous and discrete cases, respectively.
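The continuous case, fitting a two-component Gaussian mixture by expectation-maximisation, can be sketched as follows; the synthetic data, initial guesses, and iteration count are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic data: two well-separated Gaussian clusters (assumed parameters)
data = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 1, 300)])

# Initial guesses for mixing weight, means, and standard deviations
pi, mu, sigma = 0.5, np.array([-1.0, 1.0]), np.array([1.0, 1.0])

def gauss(x, m, s):
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))

for _ in range(50):
    # E-step: responsibility of component 1 for each data point
    p0 = (1 - pi) * gauss(data, mu[0], sigma[0])
    p1 = pi * gauss(data, mu[1], sigma[1])
    r = p1 / (p0 + p1)
    # M-step: responsibility-weighted updates of all parameters
    pi = r.mean()
    mu = np.array([np.sum((1 - r) * data) / np.sum(1 - r),
                   np.sum(r * data) / np.sum(r)])
    sigma = np.array([np.sqrt(np.sum((1 - r) * (data - mu[0]) ** 2) / np.sum(1 - r)),
                      np.sqrt(np.sum(r * (data - mu[1]) ** 2) / np.sum(r))])

print(np.round(mu, 1))  # means should approach the generating values
```

The Poisson case follows the same E/M structure, with the Gaussian density replaced by the Poisson mass function and only the rate parameters updated in the M-step.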

Asynchronous Batch Bayesian Optimisation with Improved Local Penalisation

a5a/asynchronous-BO 29 Jan 2019

Batch Bayesian optimisation (BO) has been successfully applied to hyperparameter tuning using parallel computing, but it is wasteful of resources: workers that complete jobs ahead of others are left idle.
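The local-penalisation idea behind such batch methods can be sketched as multiplying the acquisition function by a soft penalty around points that busy workers are still evaluating, so idle workers are steered elsewhere. The Gaussian-shaped penalty and fixed radius below are simplifying assumptions, not the paper's Lipschitz-based construction:

```python
import numpy as np

def penalised_acquisition(acq, candidates, busy, radius=0.3):
    # Down-weight the acquisition near points still under evaluation:
    # the penalty factor -> 0 at a busy point and -> 1 far away from it.
    penalty = np.ones_like(acq)
    for xb in busy:
        penalty *= 1.0 - np.exp(-0.5 * ((candidates - xb) / radius) ** 2)
    return acq * penalty

candidates = np.linspace(0, 2, 201)
acq = np.exp(-((candidates - 1.0) ** 2))  # toy acquisition peaked at x = 1.0
busy = [1.0]                              # another worker is evaluating x = 1.0
x_next = candidates[np.argmax(penalised_acquisition(acq, candidates, busy))]
print(x_next)  # the next worker is pushed away from the busy point
```

Without the penalty every idle worker would pile onto the same acquisition maximum; with it, the batch spreads out while the expensive evaluation at the busy point is still running.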

Gaussian Process Priors for Dynamic Paired Comparison Modelling

martiningram/paired-comparison-gp-laplace 20 Feb 2019

Dynamic paired comparison models, such as Elo and Glicko, are frequently used for sports prediction and ranking players or teams.

Tuning Hyperparameters without Grad Students: Scalable and Robust Bayesian Optimisation with Dragonfly

dragonfly/dragonfly 15 Mar 2019

We compare Dragonfly to a suite of other packages and algorithms for global optimisation and demonstrate that when the above methods are integrated, they enable significant improvements in the performance of BO.

Effective Estimation of Deep Generative Language Models

tom-pelsmaeker/deep-generative-lm ACL 2020

We concentrate on one such model, the variational auto-encoder, which we argue is an important building block in hierarchical probabilistic models of language.

Parallel Gaussian process surrogate Bayesian inference with noisy likelihood evaluations

mjarvenpaa/parallel-GP-SL 3 May 2019

We consider Bayesian inference when only a limited number of noisy log-likelihood evaluations can be obtained.

Fast and Reliable Architecture Selection for Convolutional Neural Networks

pdefraene/cgp_cnn_predictors 6 May 2019

The performance of a Convolutional Neural Network (CNN) depends on its hyperparameters, such as the number of layers, the kernel sizes, or the learning rate.