1 code implementation • 18 Apr 2024 • Masaki Adachi, Satoshi Hayakawa, Martin Jørgensen, Saad Hamid, Harald Oberhauser, Michael A. Osborne
Parallelisation in Bayesian optimisation is a common strategy but faces several challenges: the need for flexibility in acquisition functions and kernel choices, the need to handle discrete and continuous variables simultaneously, model misspecification, and, lastly, fast massive parallelisation.
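As a hedged illustration of the batch setting this entry describes (not the paper's algorithm), the sketch below runs generic batch Bayesian optimisation via parallel Thompson sampling over a discrete candidate grid: each batch slot gets its own GP posterior sample, so the batch naturally spreads out. The kernel, lengthscale, toy objective, and helper names such as `propose_batch` are all assumptions.

```python
# Minimal batch Bayesian optimisation sketch via parallel Thompson sampling.
import numpy as np

def rbf_kernel(A, B, lengthscale=0.2):
    # Squared-exponential kernel between row vectors of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior(X, y, Xs, noise=1e-4):
    # Exact GP posterior mean and covariance at test points Xs.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks, Kss = rbf_kernel(X, Xs), rbf_kernel(Xs, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    v = np.linalg.solve(L, Ks)
    return Ks.T @ alpha, Kss - v.T @ v

def propose_batch(X, y, candidates, batch_size=4, rng=None):
    # One posterior sample per batch slot; each sample's argmax joins the batch.
    rng = rng or np.random.default_rng(0)
    mu, cov = gp_posterior(X, y, candidates)
    samples = rng.multivariate_normal(
        mu, cov + 1e-8 * np.eye(len(mu)), size=batch_size, check_valid="ignore")
    return candidates[np.argmax(samples, axis=1)]

# Toy usage: maximise a 1-D objective with batches of 4 parallel queries.
f = lambda x: np.sin(3 * x[:, 0]) + 0.5 * np.cos(7 * x[:, 0])
cands = np.linspace(0, 1, 200)[:, None]
X = cands[[10, 100, 190]]; y = f(X)
for _ in range(5):
    batch = propose_batch(X, y, cands)          # 4 points evaluated in parallel
    X, y = np.vstack([X, batch]), np.concatenate([y, f(batch)])
print("best value found:", y.max())
```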
1 code implementation • 15 Mar 2023 • Saad Hamid, Xingchen Wan, Martin Jørgensen, Binxin Ru, Michael Osborne
Ensembling can improve the performance of Neural Networks, but existing approaches struggle when the architecture likelihood surface has dispersed, narrow peaks.
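As a rough sketch of the generic idea of evidence-weighted ensembling (a simplification, not the paper's search or weighting procedure): member predictive distributions are averaged under softmax weights over approximate log marginal likelihoods, so members sitting on higher likelihood peaks dominate. The shapes, numbers, and the `ensemble_predict` name are illustrative.

```python
# Hypothetical evidence-weighted ensemble of classifier predictions.
import numpy as np

def ensemble_predict(member_probs, log_evidences):
    # member_probs: (n_members, n_points, n_classes) predictive probabilities.
    # log_evidences: (n_members,) approximate log marginal likelihoods.
    w = np.exp(log_evidences - log_evidences.max())
    w /= w.sum()                          # normalised weights over members
    return np.einsum("m,mpc->pc", w, member_probs)

# Toy usage: three members, two test points, two classes.
probs = np.array([[[0.9, 0.1], [0.2, 0.8]],
                  [[0.6, 0.4], [0.4, 0.6]],
                  [[0.8, 0.2], [0.3, 0.7]]])
print(ensemble_predict(probs, np.array([-10.0, -12.0, -10.5])))
```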
1 code implementation • 27 Jan 2023 • Masaki Adachi, Satoshi Hayakawa, Saad Hamid, Martin Jørgensen, Harald Oberhauser, Michael A. Osborne
Batch Bayesian optimisation and Bayesian quadrature have been shown to be sample-efficient methods of performing optimisation and quadrature where expensive-to-evaluate objective functions can be queried in parallel.
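For context, here is a minimal sketch of vanilla (sequential) Bayesian quadrature, the building block that batch variants parallelise: a GP is fitted to the integrand, and the posterior mean of the integral is available in closed form as z^T K^{-1} y, where z is the kernel mean. The RBF kernel, uniform measure on [0, 1], lengthscale, and the `bq_estimate` name are assumptions, not the paper's batch method.

```python
# Vanilla Bayesian quadrature with an RBF kernel and uniform measure on [0, 1].
import numpy as np
from scipy.special import erf

ell = 0.15  # assumed RBF lengthscale

def k(a, b):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

def kernel_mean(x):
    # z_i = \int_0^1 k(t, x_i) dt, in closed form via the error function.
    c = ell * np.sqrt(np.pi / 2)
    return c * (erf((1 - x) / (np.sqrt(2) * ell)) - erf((0 - x) / (np.sqrt(2) * ell)))

def bq_estimate(x, y, noise=1e-8):
    # Posterior mean of \int_0^1 f under the GP: z^T K^{-1} y.
    K = k(x, x) + noise * np.eye(len(x))
    return kernel_mean(x) @ np.linalg.solve(K, y)

# Toy usage: integrate f(x) = sin(2*pi*x)^2 over [0, 1] (true value 0.5).
x = np.linspace(0, 1, 12)
print(bq_estimate(x, np.sin(2 * np.pi * x) ** 2))
```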
no code implementations • 14 Jun 2021 • Saad Hamid, Sebastian Schulze, Michael A. Osborne, Stephen J. Roberts
Marginalising over families of Gaussian Process kernels produces flexible model classes with well-calibrated uncertainty estimates.
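A minimal sketch of the underlying idea under simplifying assumptions: discrete Bayesian model averaging over a small kernel family, weighting each kernel's GP prediction by its marginal likelihood. A continuous family of kernels would require integrating rather than summing; the two-kernel family, lengthscales, and the `bma_predict` helper here are illustrative.

```python
# Discrete Bayesian model averaging over a small family of GP kernels.
import numpy as np

def rbf(a, b, ell):      return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)
def matern12(a, b, ell): return np.exp(-np.abs(a[:, None] - b[None, :]) / ell)

def log_marginal(K, y, noise=1e-2):
    # Standard GP log marginal likelihood via a Cholesky factorisation.
    L = np.linalg.cholesky(K + noise * np.eye(len(y)))
    a = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return -0.5 * y @ a - np.log(np.diag(L)).sum() - 0.5 * len(y) * np.log(2 * np.pi)

def bma_predict(x, y, xs, kernels, noise=1e-2):
    logZ, means = [], []
    for kern in kernels:
        K = kern(x, x) + noise * np.eye(len(x))
        logZ.append(log_marginal(kern(x, x), y, noise))
        means.append(kern(xs, x) @ np.linalg.solve(K, y))
    w = np.exp(np.array(logZ) - max(logZ)); w /= w.sum()
    return w @ np.array(means), w  # model-averaged mean and kernel weights

# Toy usage: average over two stationary kernels.
x = np.linspace(0, 1, 20); y = np.sin(6 * x)
kernels = [lambda a, b: rbf(a, b, 0.2), lambda a, b: matern12(a, b, 0.2)]
mean, w = bma_predict(x, y, np.array([0.5]), kernels)
print("kernel weights:", w, "prediction at 0.5:", mean)
```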
1 code implementation • 4 Dec 2018 • Ed Wagstaff, Saad Hamid, Michael Osborne
Integration over non-negative integrands is a central problem in machine learning (e.g. for model averaging, (hyper-)parameter marginalisation, and computing posterior predictive distributions).
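One classical way to exploit non-negativity, sketched here under simplifying assumptions, is a square-root warping: fit a GP to g = sqrt(2f) and integrate 0.5 m(x)^2, which is non-negative by construction. The lengthscale, the grid quadrature used to integrate the unwarped mean, and the `warped_integral` helper are illustrative, not the paper's exact construction.

```python
# Square-root-warped quadrature sketch for a non-negative integrand on [0, 1].
import numpy as np

ell = 0.15  # assumed RBF lengthscale

def k(a, b):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

def warped_integral(x, f_vals, grid, noise=1e-6):
    g = np.sqrt(2.0 * f_vals)                   # warp: model g = sqrt(2 f)
    K = k(x, x) + noise * np.eye(len(x))
    m = k(grid, x) @ np.linalg.solve(K, g)      # GP posterior mean of g
    est = 0.5 * m**2                            # unwarp: f ~ m^2 / 2 >= 0
    return est.mean() * (grid[-1] - grid[0])    # uniform-grid quadrature

# Toy usage: a non-negative bump on [0, 1]; true integral is roughly 0.2507.
f = lambda t: np.exp(-((t - 0.4) ** 2) / 0.02)
x = np.linspace(0, 1, 15)
print(warped_integral(x, f(x), np.linspace(0, 1, 400)))
```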