About

Optimising expensive black-box functions is a common problem in many disciplines, including tuning the parameters of machine learning algorithms, robotics, and other engineering design problems. Bayesian Optimisation is a principled and efficient technique for the global optimisation of these functions. The idea behind Bayesian Optimisation is to place a prior distribution over the target function and then update that prior with a set of “true” observations, obtained by expensively evaluating the target function, to produce a posterior predictive distribution. The posterior then informs where to make the next observation of the target function through the use of an acquisition function, which balances the exploitation of regions known to have good performance with the exploration of regions where there is little information about the function’s response.

Source: A Bayesian Approach for the Robust Optimisation of Expensive-to-Evaluate Functions
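As a concrete illustration of this loop, the sketch below runs a few iterations of Bayesian optimisation on a one-dimensional toy objective, using a Gaussian-process surrogate from scikit-learn and the expected-improvement acquisition function maximised over a fixed grid. The toy objective, kernel choice and grid are illustrative assumptions, not part of any particular paper.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):
    # Toy "expensive" black-box function (assumption for illustration only).
    return -np.sin(3 * x) - x ** 2 + 0.7 * x

def expected_improvement(X, gp, y_best, xi=0.01):
    # Acquisition: balances exploitation (high mean) and exploration (high std).
    mu, sigma = gp.predict(X, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    imp = mu - y_best - xi
    z = imp / sigma
    return imp * norm.cdf(z) + sigma * norm.pdf(z)

rng = np.random.default_rng(0)
X_obs = rng.uniform(-1.0, 2.0, size=(3, 1))        # initial "true" observations
y_obs = objective(X_obs).ravel()
grid = np.linspace(-1.0, 2.0, 500).reshape(-1, 1)  # candidate next points

for _ in range(10):
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X_obs, y_obs)                            # posterior over the objective
    ei = expected_improvement(grid, gp, y_obs.max())
    x_next = grid[np.argmax(ei)].reshape(1, 1)      # point suggested by the acquisition
    X_obs = np.vstack([X_obs, x_next])
    y_obs = np.append(y_obs, objective(x_next).ravel())  # expensive evaluation

print("best x:", X_obs[np.argmax(y_obs)].item(), "best value:", y_obs.max())
```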

Benchmarks

No evaluation results yet. Help compare methods by submitting evaluation metrics.

Greatest papers with code

BoTorch: A Framework for Efficient Monte-Carlo Bayesian Optimization

NeurIPS 2020 pytorch/botorch

Bayesian optimization provides sample-efficient global optimization for a broad range of applications, including automatic machine learning, engineering, physics, and experimental design.

BAYESIAN OPTIMISATION
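A minimal sketch of a single optimisation step with BoTorch's Monte-Carlo acquisition functions might look as follows; the two-dimensional toy objective and all settings (batch size, restarts, raw samples) are assumptions, and helper names may differ slightly between BoTorch releases.

```python
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_mll
from botorch.acquisition import qExpectedImprovement
from botorch.optim import optimize_acqf
from gpytorch.mlls import ExactMarginalLogLikelihood

# Toy data: 10 random evaluations of a 2-D objective (assumption for illustration).
train_X = torch.rand(10, 2, dtype=torch.double)
train_Y = -((train_X - 0.5) ** 2).sum(dim=-1, keepdim=True)

# Fit a GP surrogate to the observations.
model = SingleTaskGP(train_X, train_Y)
mll = ExactMarginalLogLikelihood(model.likelihood, model)
fit_gpytorch_mll(mll)  # named fit_gpytorch_model in older BoTorch releases

# Monte-Carlo expected improvement over a batch of q=2 candidates.
acq = qExpectedImprovement(model=model, best_f=train_Y.max())
bounds = torch.stack([torch.zeros(2, dtype=torch.double),
                      torch.ones(2, dtype=torch.double)])
candidates, _ = optimize_acqf(acq, bounds=bounds, q=2,
                              num_restarts=5, raw_samples=64)
print(candidates)  # next batch of points to evaluate expensively
```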

Batch Bayesian Optimization via Local Penalization

29 May 2015 SheffieldML/GPyOpt

The approach assumes that the function of interest, $f$, is a Lipschitz continuous function.

BAYESIAN OPTIMISATION EFFICIENT EXPLORATION GAUSSIAN PROCESSES
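The batch is built sequentially: after each point is selected, the acquisition function is multiplied by a penaliser that suppresses a ball around that point whose size is governed by the Lipschitz constant, so the remaining batch members are pushed elsewhere. Below is a schematic sketch of such a penaliser; the Lipschitz constant L, the estimated maximum M and the GP posterior quantities are assumed to be supplied by the surrounding BO code, and the exact form follows the general local-penalisation recipe rather than the GPyOpt implementation.

```python
import numpy as np
from scipy.special import erfc

def local_penalizer(x, x_j, mu_j, sigma_j, L, M):
    """Soft penaliser that down-weights the acquisition near a pending point x_j.

    L             : (estimated) Lipschitz constant of the objective
    M             : (estimated) maximum value of the objective
    mu_j, sigma_j : GP posterior mean and std at x_j
    """
    r = np.linalg.norm(x - x_j)
    z = (L * r - M + mu_j) / (np.sqrt(2.0) * max(sigma_j, 1e-9))
    return 0.5 * erfc(-z)  # close to 0 inside the exclusion ball, close to 1 far away

def penalized_acquisition(x, base_acq, pending, gp_mu, gp_sigma, L, M):
    # Multiply the base acquisition by one penaliser per already-selected batch point.
    value = base_acq(x)
    for x_j in pending:
        value *= local_penalizer(x, x_j, gp_mu(x_j), gp_sigma(x_j), L, M)
    return value
```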

Tuning Hyperparameters without Grad Students: Scalable and Robust Bayesian Optimisation with Dragonfly

15 Mar 2019 dragonfly/dragonfly

We compare Dragonfly to a suite of other packages and algorithms for global optimisation and demonstrate that when the above methods are integrated, they enable significant improvements in the performance of BO.

BAYESIAN OPTIMISATION
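For basic use, Dragonfly exposes simple function-optimisation entry points; a hedged sketch of calling one on a toy objective is below (the objective, domain and evaluation budget are assumptions, and argument names should be checked against the installed version).

```python
from dragonfly import minimise_function

def objective(x):
    # Toy 2-D objective (assumption for illustration only).
    return (x[0] - 0.3) ** 2 + (x[1] + 0.2) ** 2

domain = [[-1.0, 1.0], [-1.0, 1.0]]  # box bounds for the two inputs
# Third argument is the evaluation budget ("capital"), here 50 function calls.
min_val, min_pt, history = minimise_function(objective, domain, 50)
print(min_val, min_pt)
```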

Neural Architecture Generator Optimization

NeurIPS 2020 huawei-noah/vega

Neural Architecture Search (NAS) was first proposed to achieve state-of-the-art performance through the discovery of new architecture patterns, without human intervention.

BAYESIAN OPTIMISATION NEURAL ARCHITECTURE SEARCH

GPflowOpt: A Bayesian Optimization Library using TensorFlow

10 Nov 2017 GPflow/GPflowOpt

A novel Python framework for Bayesian optimization known as GPflowOpt is introduced.

BAYESIAN OPTIMISATION GAUSSIAN PROCESSES

Neural Architecture Search with Bayesian Optimisation and Optimal Transport

NeurIPS 2018 kirthevasank/nasbot

A common use case for BO in machine learning is model selection, where it is not possible to analytically model the generalisation performance of a statistical model, and we resort to noisy and expensive training and validation procedures to choose the best model.

BAYESIAN OPTIMISATION MODEL SELECTION NEURAL ARCHITECTURE SEARCH
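To make the model-selection use case concrete, the sketch below treats the cross-validated score of a scikit-learn classifier as a noisy, expensive black box of a single hyperparameter and optimises it with the same GP-plus-expected-improvement loop sketched in the About section; the dataset, model and search range are illustrative assumptions unrelated to this paper's architecture-search setting.

```python
import numpy as np
from scipy.stats import norm
from sklearn.datasets import load_breast_cancer
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X_data, y_data = load_breast_cancer(return_X_y=True)

def validation_score(log_C):
    # Noisy, "expensive" black box: cross-validated accuracy as a function of log10(C).
    return cross_val_score(SVC(C=10.0 ** log_C), X_data, y_data, cv=3).mean()

grid = np.linspace(-3, 3, 200).reshape(-1, 1)   # search range for log10(C)
obs_x = np.array([[-2.0], [0.0], [2.0]])        # initial hyperparameter trials
obs_y = np.array([validation_score(x[0]) for x in obs_x])

for _ in range(8):
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(obs_x, obs_y)
    mu, sigma = gp.predict(grid, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    z = (mu - obs_y.max()) / sigma
    ei = (mu - obs_y.max()) * norm.cdf(z) + sigma * norm.pdf(z)  # expected improvement
    x_next = grid[np.argmax(ei)]
    obs_x = np.vstack([obs_x, [x_next]])
    obs_y = np.append(obs_y, validation_score(x_next[0]))

print("best log10(C):", obs_x[np.argmax(obs_y)][0], "cv accuracy:", obs_y.max())
```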

Effective Estimation of Deep Generative Language Models

ACL 2020 tom-pelsmaeker/deep-generative-lm

We concentrate on one such model, the variational auto-encoder, which we argue is an important building block in hierarchical probabilistic models of language.

BAYESIAN OPTIMISATION LANGUAGE MODELLING VARIATIONAL INFERENCE

Asynchronous Parallel Bayesian Optimisation via Thompson Sampling

25 May 2017 kirthevasank/gp-parallel-ts

We design and analyse variations of the classical Thompson sampling (TS) procedure for Bayesian optimisation (BO) in settings where function evaluations are expensive, but can be performed in parallel.

BAYESIAN OPTIMISATION
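In its simplest sequential form, Thompson sampling for BO draws one random function from the GP posterior and evaluates the objective where that sample is largest; in the parallel setting each worker draws its own posterior sample, which is what makes asynchronous evaluation straightforward. A minimal sequential sketch using scikit-learn's posterior sampling is below; the toy objective and candidate grid are assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):
    # Toy expensive objective (assumption for illustration only).
    return np.sin(3 * x) + 0.5 * np.cos(5 * x)

rng = np.random.default_rng(0)
grid = np.linspace(0.0, 2.0, 300).reshape(-1, 1)
X_obs = rng.uniform(0.0, 2.0, size=(3, 1))
y_obs = objective(X_obs).ravel()

for i in range(10):
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X_obs, y_obs)
    # Thompson sampling: draw one function from the posterior and maximise it.
    sample = gp.sample_y(grid, n_samples=1, random_state=i).ravel()
    x_next = grid[np.argmax(sample)].reshape(1, 1)
    X_obs = np.vstack([X_obs, x_next])
    y_obs = np.append(y_obs, objective(x_next).ravel())

print("best x:", X_obs[np.argmax(y_obs)].item(), "best value:", y_obs.max())
```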

BayesOpt Adversarial Attack

ICLR 2020 rubinxin/BayesOpt_Attack

Black-box adversarial attacks require a large number of attempts before finding successful adversarial examples that are visually indistinguishable from the original input.

ADVERSARIAL ATTACK BAYESIAN OPTIMISATION DIMENSIONALITY REDUCTION MODEL SELECTION

Bayesian Optimisation over Multiple Continuous and Categorical Inputs

ICML 2020 rubinxin/CoCaBO_code

Efficient optimisation of black-box problems that comprise both continuous and categorical inputs is important, yet poses significant challenges.

BAYESIAN OPTIMISATION MULTI-ARMED BANDITS
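A common structure for such mixed problems, and the broad shape of bandit-plus-GP methods, is to let a multi-armed bandit choose the categorical setting while a continuous optimiser handles the remaining dimensions conditioned on that choice. The sketch below shows only this decomposition, with an epsilon-greedy bandit and random continuous proposals standing in for a full BO step; the toy objective and all names are assumptions, not the paper's code.

```python
import numpy as np

rng = np.random.default_rng(0)
categories = ["relu", "tanh", "sigmoid"]  # categorical input (assumed for illustration)

def objective(cat, x):
    # Toy mixed-input black box (assumption for illustration only).
    bonus = {"relu": 0.3, "tanh": 0.1, "sigmoid": 0.0}[cat]
    return bonus - (x - 0.5) ** 2

counts = {c: 0 for c in categories}
rewards = {c: 0.0 for c in categories}

for t in range(30):
    # Bandit step over the categorical dimension (epsilon-greedy stand-in for EXP3).
    if t < len(categories) or rng.random() < 0.2:
        cat = categories[t % len(categories)]
    else:
        cat = max(categories, key=lambda c: rewards[c] / max(counts[c], 1))
    # Continuous step: here a random proposal; a real method would run a GP-based
    # BO step conditioned on the chosen category.
    x = rng.uniform(0.0, 1.0)
    value = objective(cat, x)
    counts[cat] += 1
    rewards[cat] += value

best_cat = max(categories, key=lambda c: rewards[c] / max(counts[c], 1))
print("most promising category:", best_cat)
```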