Bayesian Optimisation

84 papers with code • 0 benchmarks • 0 datasets

Expensive black-box functions arise in many disciplines, including tuning the hyperparameters of machine learning algorithms, robotics, and other engineering design problems. Bayesian Optimisation is a principled and sample-efficient technique for the global optimisation of such functions. The idea behind Bayesian Optimisation is to place a prior distribution over the target function and update that prior with a set of "true" (and expensive) observations of the target function, producing a posterior predictive distribution. The posterior then informs where to make the next observation through an acquisition function, which balances exploitation of regions known to perform well with exploration of regions where there is little information about the function's response.

Source: A Bayesian Approach for the Robust Optimisation of Expensive-to-Evaluate Functions
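The loop described above — fit a surrogate to the observations, maximise an acquisition function, evaluate the objective at the chosen point, repeat — can be sketched in a few lines. The following is a minimal, self-contained illustration (not any particular library's implementation): a Gaussian-process surrogate with an RBF kernel and the Expected Improvement acquisition, applied to a hypothetical 1-D objective with its maximum at x = 0.6.

```python
import numpy as np
from math import erf, sqrt, pi

# Hypothetical expensive objective to maximise (optimum at x = 0.6).
def objective(x):
    return -(x - 0.6) ** 2

def rbf_kernel(a, b, length=0.1):
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    # Standard GP regression: posterior mean and std at candidate points Xs.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xs)
    Kss = rbf_kernel(Xs, Xs)
    alpha = np.linalg.solve(K, y)
    mu = Ks.T @ alpha
    var = np.diag(Kss - Ks.T @ np.linalg.solve(K, Ks))
    return mu, np.sqrt(np.maximum(var, 1e-12))

def norm_cdf(z):
    return 0.5 * (1.0 + np.vectorize(erf)(z / sqrt(2.0)))

def expected_improvement(mu, sigma, best, xi=0.01):
    # EI trades off exploitation (high mu) against exploration (high sigma).
    z = (mu - best - xi) / sigma
    pdf = np.exp(-0.5 * z ** 2) / sqrt(2.0 * pi)
    return (mu - best - xi) * norm_cdf(z) + sigma * pdf

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, 3)            # initial "true" observations
y = objective(X)
grid = np.linspace(0, 1, 200)       # candidate points

for _ in range(10):                 # BO loop: fit, acquire, evaluate
    mu, sigma = gp_posterior(X, y, grid)
    ei = expected_improvement(mu, sigma, y.max())
    x_next = grid[np.argmax(ei)]    # next evaluation of the target function
    X = np.append(X, x_next)
    y = np.append(y, objective(x_next))

print(X[np.argmax(y)])              # best input found, close to 0.6
```

In practice, libraries such as BoTorch or HEBO (listed below) replace each piece with a more robust component — learned kernel hyperparameters, gradient-based acquisition optimisation, and batched or noisy acquisitions — but the structure of the loop is the same.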

Libraries

Use these libraries to find Bayesian Optimisation models and implementations

Most implemented papers

Max-value Entropy Search for Efficient Bayesian Optimization

zi-w/Max-value-Entropy-Search ICML 2017

We propose a new criterion, Max-value Entropy Search (MES), that instead uses the information about the maximum function value.

HEBO: Pushing The Limits of Sample-Efficient Hyperparameter Optimisation

huawei-noah/hebo 7 Dec 2020

Our results on the Bayesmark benchmark indicate that heteroscedasticity and non-stationarity pose significant challenges for black-box optimisers.

Developing Optimal Causal Cyber-Defence Agents via Cyber Security Simulation

dstl/yawning-titan 25 Jul 2022

In this paper we explore cyber security defence, through the unification of a novel cyber security simulator with models for (causal) decision-making through optimisation.

Bayesian optimisation for fast approximate inference in state-space models with intractable likelihoods

compops/gpo-smc-abc 23 Jun 2015

We consider the problem of approximate Bayesian parameter inference in non-linear state-space models with intractable likelihoods.

Bayesian Optimisation over Multiple Continuous and Categorical Inputs

rubinxin/CoCaBO_code ICML 2020

Efficient optimisation of black-box problems that comprise both continuous and categorical inputs is important, yet poses significant challenges.

On the Expressiveness of Approximate Inference in Bayesian Neural Networks

cambridge-mlg/expressiveness-approx-bnns NeurIPS 2020

While Bayesian neural networks (BNNs) hold the promise of being flexible, well-calibrated statistical models, inference often requires approximations whose consequences are poorly understood.

BoTorch: A Framework for Efficient Monte-Carlo Bayesian Optimization

pytorch/botorch NeurIPS 2020

Bayesian optimization provides sample-efficient global optimization for a broad range of applications, including automatic machine learning, engineering, physics, and experimental design.

Neural Architecture Generator Optimization

huawei-noah/vega NeurIPS 2020

Neural Architecture Search (NAS) was first proposed to achieve state-of-the-art performance through the discovery of new architecture patterns, without human intervention.

High-Dimensional Bayesian Optimisation with Variational Autoencoders and Deep Metric Learning

huawei-noah/hebo 7 Jun 2021

We introduce a method combining variational autoencoders (VAEs) and deep metric learning to perform Bayesian optimisation (BO) over high-dimensional and structured input spaces.

Are Random Decompositions all we need in High Dimensional Bayesian Optimisation?

huawei-noah/hebo 30 Jan 2023

Learning decompositions of expensive-to-evaluate black-box functions promises to scale Bayesian optimisation (BO) to high-dimensional problems.