1 code implementation • 3 Mar 2023 • Dat Phan-Trong, Hung Tran-The, Sunil Gupta
Bayesian Optimization (BO) is an effective approach for global optimization of black-box functions when function evaluations are expensive.
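As a hedged illustration of the general BO loop this line refers to (not the paper's specific algorithm): fit a Gaussian process surrogate to the evaluations so far, maximise an acquisition function over candidates, evaluate the black-box function there, and repeat. The kernel, UCB acquisition, and 1-D grid below are illustrative choices only.

```python
import numpy as np

def rbf_kernel(a, b, ls=0.3):
    # Squared-exponential kernel between two sets of 1-D points.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior(x_train, y_train, x_query, noise=1e-6):
    # Standard GP regression posterior mean and standard deviation.
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_query, x_train)
    mu = Ks @ np.linalg.solve(K, y_train)
    v = np.linalg.solve(K, Ks.T)
    var = 1.0 - np.sum(Ks * v.T, axis=1)   # prior variance is 1 on the diagonal
    return mu, np.sqrt(np.maximum(var, 1e-12))

def bayes_opt(f, n_iter=15, seed=0):
    # Generic BO loop: surrogate fit -> acquisition maximisation -> evaluation.
    rng = np.random.default_rng(seed)
    x = rng.uniform(0, 1, size=3)          # small initial design
    y = np.array([f(xi) for xi in x])
    grid = np.linspace(0, 1, 201)          # candidate set for the acquisition
    for _ in range(n_iter):
        mu, sd = gp_posterior(x, y, grid)
        ucb = mu + 2.0 * sd                # UCB acquisition (one common choice)
        x_next = grid[np.argmax(ucb)]
        x = np.append(x, x_next)
        y = np.append(y, f(x_next))
    return x[np.argmax(y)], y.max()

# Maximise a cheap stand-in for an "expensive" black-box function.
x_best, y_best = bayes_opt(lambda t: -(t - 0.3) ** 2)
```

The point of the loop is that each evaluation of `f` is chosen where the surrogate predicts the largest plausible improvement, so few evaluations are wasted.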
2 code implementations • 1 Jan 2023 • Hien Dang, Tho Tran, Stanley Osher, Hung Tran-The, Nhat Ho, Tan Nguyen
Modern deep neural networks have achieved impressive performance on tasks from image classification to natural language processing.
no code implementations • 15 Mar 2022 • Hung Tran-The, Sunil Gupta, Santu Rana, Svetha Venkatesh
In particular, whether the EI strategy with a standard incumbent converges in the noisy setting is still an open question in the Gaussian process bandit optimization problem.
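For reference, the expected improvement (EI) acquisition with the standard incumbent uses the best observed value $y^*$ and the GP posterior mean and standard deviation at a point; a minimal sketch of the closed-form EI for maximisation:

```python
import math

def expected_improvement(mu, sigma, incumbent):
    """EI for maximisation with the standard incumbent y* = best observed value.

    EI(x) = (mu - y*) * Phi(z) + sigma * phi(z),  with  z = (mu - y*) / sigma,
    where Phi and phi are the standard normal CDF and PDF.
    """
    if sigma <= 0.0:
        return max(mu - incumbent, 0.0)
    z = (mu - incumbent) / sigma
    Phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    return (mu - incumbent) * Phi + sigma * phi
```

Note that EI is always non-negative and grows with the posterior uncertainty `sigma`, which is what makes its convergence behaviour under observation noise subtle.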
no code implementations • 29 Sep 2021 • Hung Tran-The, Sunil Gupta, Santu Rana, Long Tran-Thanh, Svetha Venkatesh
With a linear reward function, we demonstrate that our algorithm achieves a near-optimal regret.
no code implementations • 24 Jul 2021 • Hung Tran-The, Sunil Gupta, Thanh Nguyen-Tang, Santu Rana, Svetha Venkatesh
We propose a novel approach that uses a hybrid of offline learning with online exploration.
no code implementations • 10 May 2021 • Hung Tran-The, Sunil Gupta, Santu Rana, Svetha Venkatesh
Bayesian optimisation (BO) is a well-known efficient algorithm for finding the global optimum of expensive, black-box functions.
no code implementations • 11 Mar 2021 • Thanh Nguyen-Tang, Sunil Gupta, Hung Tran-The, Svetha Venkatesh
To the best of our knowledge, this is the first theoretical characterization of the sample complexity of offline RL with deep neural network function approximation under the general Besov regularity condition, which goes beyond the linearity regime of traditional Reproducing Kernel Hilbert Spaces and Neural Tangent Kernels.
no code implementations • NeurIPS 2020 • Hung Tran-The, Sunil Gupta, Santu Rana, Huong Ha, Svetha Venkatesh
To this end, we propose a novel BO algorithm which expands (and shifts) the search space over iterations, controlling the expansion rate through a hyperharmonic series.
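A hedged sketch of the convergent-expansion idea (the paper's exact schedule and parameters may differ): if each iteration $t$ widens the search bounds by $c/t^p$ per side with $p > 1$, the cumulative growth is a hyperharmonic (p-series) partial sum, which converges, so the expanded search space stays bounded over infinitely many iterations.

```python
def expanded_bounds(initial_low, initial_high, k, c=1.0, p=1.5):
    # Cumulative expansion after k iterations: step t adds c / t**p per side.
    # For p > 1 the hyperharmonic series sum_t c/t^p converges (to c*zeta(p)),
    # so the search space remains bounded as k grows. c and p are
    # illustrative parameters, not the paper's.
    growth = sum(c / t ** p for t in range(1, k + 1))
    return initial_low - growth, initial_high + growth
```

With `p <= 1` the series would diverge and the space would grow without bound; choosing `p > 1` is what keeps the expansion controlled.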
no code implementations • 27 Nov 2019 • Hung Tran-The, Sunil Gupta, Santu Rana, Svetha Venkatesh
Optimising the acquisition function in low-dimensional subspaces allows our method to obtain accurate solutions within a limited computational budget.
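One way to illustrate this idea (a sketch under assumptions, not the paper's algorithm): instead of searching the full high-dimensional space, draw a random linear subspace, optimise the acquisition over low-dimensional points, and embed the best one back into the original space. The random-search inner optimiser and the quadratic stand-in acquisition below are hypothetical.

```python
import numpy as np

def maximise_in_subspace(acq, dim, sub_dim=2, n_candidates=500, seed=0):
    # Draw a random linear embedding A: R^sub_dim -> R^dim, search the
    # acquisition over low-dimensional candidates y, and map back via A.
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((dim, sub_dim)) / np.sqrt(sub_dim)
    Y = rng.uniform(-1.0, 1.0, size=(n_candidates, sub_dim))  # search in sub_dim dims only
    X = Y @ A.T                                               # embed candidates into R^dim
    vals = np.array([acq(x) for x in X])
    return X[np.argmax(vals)], vals.max()

# Hypothetical acquisition stand-in: highest near the origin of R^20.
x_star, v = maximise_in_subspace(lambda x: -np.sum(x ** 2), dim=20)
```

The inner search cost scales with `sub_dim` rather than `dim`, which is the source of the computational saving when `dim` is large.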
1 code implementation • NeurIPS 2019 • Huong Ha, Santu Rana, Sunil Gupta, Thanh Nguyen, Hung Tran-The, Svetha Venkatesh
Applying Bayesian optimization in problems wherein the search space is unknown is challenging.