Search Results for author: Hung Tran-The

Found 10 papers, 2 papers with code

Neural-BO: A Black-box Optimization Algorithm using Deep Neural Networks

no code implementations 3 Mar 2023 Dat Phan-Trong, Hung Tran-The, Sunil Gupta

Bayesian Optimization (BO) is an effective approach for global optimization of black-box functions when function evaluations are expensive.

Bayesian Optimization Gaussian Processes
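The abstract describes the standard BO setting: sequentially choosing where to evaluate an expensive black-box function using a surrogate model. Below is a minimal, self-contained sketch of that generic loop with a GP surrogate and a lower-confidence-bound acquisition; the RBF kernel, length-scale, `beta`, and toy objective are all illustrative assumptions, and this is not the paper's Neural-BO algorithm (which, per the title, uses a deep neural network rather than a GP as the surrogate).

```python
import numpy as np

def rbf_kernel(A, B, ls=0.2):
    # Squared-exponential kernel between row vectors of A and B.
    d = A[:, None, :] - B[None, :, :]
    return np.exp(-0.5 * np.sum(d**2, axis=-1) / ls**2)

def gp_posterior(X, y, Xs, noise=1e-6):
    # Standard GP posterior mean/variance with an RBF kernel and jitter.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xs)
    mu = Ks.T @ np.linalg.solve(K, y)
    var = np.diag(rbf_kernel(Xs, Xs) - Ks.T @ np.linalg.solve(K, Ks))
    return mu, np.maximum(var, 1e-12)

def bo_minimise(f, bounds, n_init=3, n_iter=15, beta=2.0, seed=0):
    # Generic BO loop: fit surrogate, minimise acquisition, evaluate, repeat.
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, size=(n_init, 1))
    y = np.array([f(x[0]) for x in X])
    grid = np.linspace(lo, hi, 200).reshape(-1, 1)
    for _ in range(n_iter):
        mu, var = gp_posterior(X, y, grid)
        lcb = mu - beta * np.sqrt(var)   # lower confidence bound (minimisation)
        x_next = grid[np.argmin(lcb)]
        X = np.vstack([X, x_next])
        y = np.append(y, f(x_next[0]))
    return X[np.argmin(y), 0], y.min()

# Toy "expensive" black-box objective; the loop should locate a negative region.
x_best, y_best = bo_minimise(lambda x: np.sin(10 * x) + x**2, (-1.0, 2.0))
```

The acquisition here is optimised by enumerating a grid, which only works in one dimension; real BO implementations optimise the acquisition with a continuous optimiser.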

Neural Collapse in Deep Linear Networks: From Balanced to Imbalanced Data

2 code implementations 1 Jan 2023 Hien Dang, Tho Tran, Stanley Osher, Hung Tran-The, Nhat Ho, Tan Nguyen

Modern deep neural networks have achieved impressive performance on tasks from image classification to natural language processing.

Image Classification

Regret Bounds for Expected Improvement Algorithms in Gaussian Process Bandit Optimization

no code implementations 15 Mar 2022 Hung Tran-The, Sunil Gupta, Santu Rana, Svetha Venkatesh

In particular, whether the EI strategy with a standard incumbent converges in the noisy setting is still an open question in Gaussian process bandit optimization.

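The "EI strategy with a standard incumbent" the abstract analyses is the classical Expected Improvement acquisition, which has a closed form under a Gaussian posterior. A minimal stdlib-only sketch for minimisation, where the incumbent is assumed to be the best function value observed so far:

```python
import math

def expected_improvement(mu, sigma, incumbent):
    # Closed-form EI for minimisation under a Gaussian posterior N(mu, sigma^2).
    # `incumbent` is the best (lowest) value observed so far -- the
    # "standard incumbent" choice the abstract refers to.
    if sigma <= 0.0:
        # Degenerate posterior: improvement is deterministic.
        return max(incumbent - mu, 0.0)
    z = (incumbent - mu) / sigma
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    return (incumbent - mu) * cdf + sigma * pdf
```

EI is zero where the posterior is certain and no better than the incumbent, and grows with both the predicted improvement and the posterior uncertainty, which is what makes its noisy-setting convergence behaviour non-trivial to analyse.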

Bayesian Optimistic Optimisation with Exponentially Decaying Regret

no code implementations 10 May 2021 Hung Tran-The, Sunil Gupta, Santu Rana, Svetha Venkatesh

Bayesian optimisation (BO) is a well-known efficient algorithm for finding the global optimum of expensive, black-box functions.

Bayesian Optimisation

Sample Complexity of Offline Reinforcement Learning with Deep ReLU Networks

no code implementations 11 Mar 2021 Thanh Nguyen-Tang, Sunil Gupta, Hung Tran-The, Svetha Venkatesh

To the best of our knowledge, this is the first theoretical characterization of the sample complexity of offline RL with deep neural network function approximation under the general Besov regularity condition, which goes beyond the linearity regime of traditional reproducing kernel Hilbert spaces and Neural Tangent Kernels.

Offline RL Reinforcement Learning +1

Sub-linear Regret Bounds for Bayesian Optimisation in Unknown Search Spaces

no code implementations NeurIPS 2020 Hung Tran-The, Sunil Gupta, Santu Rana, Huong Ha, Svetha Venkatesh

To this end, we propose a novel BO algorithm which expands (and shifts) the search space over iterations by controlling the expansion rate through a hyperharmonic series.

Bayesian Optimisation
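The abstract's key device is controlling how fast the search space grows using a hyperharmonic series, i.e. increments proportional to 1/k^p; for p > 1 the partial sums converge, so the space keeps expanding yet stays bounded. The schedule below is an illustrative sketch of that idea (the function name, `p`, and `scale` are assumptions, not the paper's exact construction):

```python
def expanded_bounds(lo0, hi0, t, p=1.5, scale=1.0):
    # Widen the initial interval [lo0, hi0] after t iterations by increments
    # proportional to 1/k^p (a hyperharmonic series). With p > 1 the partial
    # sums converge, so the search space grows monotonically but remains
    # bounded for all t. Illustrative schedule, not the paper's construction.
    growth = scale * sum(1.0 / k**p for k in range(1, t + 1))
    return lo0 - growth, hi0 + growth
```

The choice p > 1 is what distinguishes this from harmonic (p = 1) growth, whose partial sums diverge and would let the search space grow without bound.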

Trading Convergence Rate with Computational Budget in High Dimensional Bayesian Optimization

no code implementations 27 Nov 2019 Hung Tran-The, Sunil Gupta, Santu Rana, Svetha Venkatesh

Optimising the acquisition function in low-dimensional subspaces allows our method to obtain accurate solutions within a limited computational budget.

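The abstract's idea of optimising the acquisition function only over low-dimensional subspaces can be sketched with a generic random linear embedding: search a cheap `sub_dim`-dimensional box and lift candidates into the full space. Everything here (the embedding, the candidate-sampling search, the toy acquisition) is an assumption for illustration, not the paper's exact method:

```python
import numpy as np

def optimise_acq_in_subspace(acq, dim, sub_dim=2, n_candidates=500, seed=0):
    # Maximise `acq` over the box [-1, 1]^dim by searching only a random
    # sub_dim-dimensional subspace: sample z in a low-dimensional box, lift
    # each candidate to the full space via x = clip(A @ z), and evaluate acq
    # there. Cost scales with sub_dim, not dim -- the accuracy/budget
    # trade-off the abstract describes.
    rng = np.random.default_rng(seed)
    A = rng.normal(size=(dim, sub_dim)) / np.sqrt(sub_dim)
    Z = rng.uniform(-1.0, 1.0, size=(n_candidates, sub_dim))
    X = np.clip(Z @ A.T, -1.0, 1.0)          # candidates lifted to full space
    scores = np.array([acq(x) for x in X])
    best = int(np.argmax(scores))
    return X[best], scores[best]

# Example with a cheap stand-in "acquisition" peaked at the origin of a
# 50-dimensional space.
x_star, s_star = optimise_acq_in_subspace(lambda x: -np.sum(x**2), dim=50)
```

Because only `sub_dim` coordinates are searched, the maximiser found is approximate, which is exactly the convergence-rate-for-budget trade-off the paper's title refers to.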
