Search Results for author: Mike Kirby

Found 4 papers, 3 papers with code

Functional Bayesian Tucker Decomposition for Continuous-indexed Tensor Data

1 code implementation • 8 Nov 2023 • Shikai Fang, Xin Yu, Zheng Wang, Shibo Li, Mike Kirby, Shandian Zhe

To generalize Tucker decomposition to such scenarios, we propose Functional Bayesian Tucker Decomposition (FunBaT).

Gaussian Processes
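
A minimal sketch of the idea behind FunBaT, not the authors' implementation: in classic Tucker decomposition an entry is the core tensor contracted with one latent factor per mode, and FunBaT replaces each discrete factor matrix with a function of the continuous index so entries can be predicted at arbitrary coordinates. The Fourier-feature factor maps below are hypothetical stand-ins for the paper's functional (GP-based) factors.

```python
import numpy as np

# Toy continuous-indexed Tucker model (illustration only, not FunBaT itself).
rank = (3, 3, 3)
core = np.random.randn(*rank)            # Tucker core tensor G

def mode_factor(coord, r, seed):
    """Map a continuous index to an r-dimensional latent factor (toy choice)."""
    rng = np.random.default_rng(seed)    # fixed seed => a fixed function per mode
    freqs = rng.normal(size=r)
    return np.cos(freqs * coord)         # length-r factor vector

def predict(x, y, z):
    """Predicted tensor entry at continuous indexes (x, y, z)."""
    u = mode_factor(x, rank[0], 0)
    v = mode_factor(y, rank[1], 1)
    w = mode_factor(z, rank[2], 2)
    # Contract core with the three factor vectors: sum_{ijk} G[i,j,k] u_i v_j w_k
    return np.einsum('ijk,i,j,k->', core, u, v, w)

print(predict(0.25, 1.7, -0.3))
```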

Multi-Resolution Active Learning of Fourier Neural Operators

1 code implementation • 29 Sep 2023 • Shibo Li, Xin Yu, Wei Xing, Mike Kirby, Akil Narayan, Shandian Zhe

To overcome this problem, we propose Multi-Resolution Active learning of FNO (MRA-FNO), which can dynamically select the input functions and resolutions to lower the data cost as much as possible while optimizing the learning efficiency.

Active Learning, LEMMA, +2
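
A toy sketch of the acquisition step suggested by the abstract, not the MRA-FNO code: each candidate query is an (input function, resolution) pair, and the pair with the best utility-to-cost ratio is selected. The cost model and the random placeholder utility below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

resolutions = [16, 32, 64]               # available discretization levels
cost = {r: r ** 2 for r in resolutions}  # assumed cost model: grows with grid size

# Candidates are (input-function index, resolution) pairs.
candidates = [(i, r) for i in range(10) for r in resolutions]

def utility(candidate):
    """Placeholder acquisition value (e.g., predictive variance or information gain)."""
    return rng.random()

# Pick the query that maximizes utility per unit cost.
scores = {c: utility(c) / cost[c[1]] for c in candidates}
best = max(scores, key=scores.get)
print(f"query input #{best[0]} at resolution {best[1]}")
```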

Multi-Fidelity Bayesian Optimization via Deep Neural Networks

no code implementations • NeurIPS 2020 • Shibo Li, Wei Xing, Mike Kirby, Shandian Zhe

In many applications, the objective function can be evaluated at multiple fidelities to enable a trade-off between the cost and accuracy.

Bayesian Optimization
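
A toy illustration of the cost/accuracy trade-off mentioned in the abstract, not the paper's deep-network multi-fidelity Bayesian optimization: cheap low-fidelity evaluations narrow down the search region before the remaining budget is spent on expensive high-fidelity queries. The objective, cost values, and two-stage strategy are all assumptions.

```python
import numpy as np

def f_high(x):                       # expensive "ground-truth" objective (assumed)
    return np.sin(3 * x) + 0.5 * x

def f_low(x):                        # cheap, biased approximation (assumed)
    return np.sin(3 * x)

xs = np.linspace(0, 2, 200)
budget, cost_low, cost_high = 20.0, 1.0, 5.0

# Stage 1: scan coarsely with the cheap fidelity.
n_low = int((budget - 2 * cost_high) // cost_low)
coarse = xs[np.linspace(0, len(xs) - 1, n_low, dtype=int)]
best_region = coarse[np.argmax(f_low(coarse))]

# Stage 2: spend the remaining budget on high-fidelity queries near the candidate.
refine = best_region + np.array([-0.05, 0.05])
x_best = refine[np.argmax(f_high(refine))]
print(f"selected x = {x_best:.3f}, f_high = {f_high(x_best):.3f}")
```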

Scalable Variational Gaussian Process Regression Networks

2 code implementations • 25 Mar 2020 • Shibo Li, Wei Xing, Mike Kirby, Shandian Zhe

Gaussian process regression networks (GPRN) are powerful Bayesian models for multi-output regression, but their inference is intractable.

Regression, Variational Inference
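
A minimal sketch of the GPRN generative model that makes the inference problem concrete, not the paper's scalable variational scheme: each output is y(x) = W(x) f(x) + noise, where both the latent functions f and the mixing weights W are Gaussian processes, so the posterior involves a product of GPs and has no closed form.

```python
import numpy as np

def rbf(X, lengthscale=0.3):
    """Squared-exponential kernel matrix for 1-D inputs."""
    d = X[:, None] - X[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(0)
X = np.linspace(0, 1, 50)
K = rbf(X) + 1e-6 * np.eye(len(X))        # jitter for numerical stability

Q, P = 2, 3                                # latent functions, observed outputs
F = rng.multivariate_normal(np.zeros(len(X)), K, size=Q)         # (Q, N) latent GPs
W = rng.multivariate_normal(np.zeros(len(X)), K, size=(P, Q))    # (P, Q, N) weight GPs

# Outputs: y_p(x_n) = sum_q W_pq(x_n) f_q(x_n) + noise
Y = np.einsum('pqn,qn->pn', W, F) + 0.05 * rng.normal(size=(P, len(X)))
print(Y.shape)                             # (P, N): P correlated outputs at N inputs
```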
