1 code implementation • 6 Jul 2023 • Kenza Tazi, Jihao Andreas Lin, Ross Viljoen, Alex Gardner, ST John, Hong Ge, Richard E. Turner
Gaussian Processes (GPs) offer an attractive method for regression over small, structured and correlated datasets.
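As a reminder of the basic machinery, here is a minimal GP regression sketch with a squared-exponential kernel; the hyperparameters (ℓ, σf, σn) and the data are illustrative placeholders, not values from the paper.

```julia
using LinearAlgebra

# Squared-exponential kernel; ℓ and σf are illustrative hyperparameters.
k(x, x′; ℓ=1.0, σf=1.0) = σf^2 * exp(-(x - x′)^2 / (2ℓ^2))

# Standard GP posterior mean and covariance at test inputs Xs,
# given noisy observations y at training inputs X.
function gp_posterior(X, y, Xs; σn=0.1)
    K   = [k(a, b) for a in X, b in X] + σn^2 * I
    Ks  = [k(a, b) for a in Xs, b in X]
    Kss = [k(a, b) for a in Xs, b in Xs]
    μ = Ks * (K \ y)
    Σ = Kss - Ks * (K \ Ks')
    return μ, Σ
end

X, y = [0.0, 0.5, 1.0], [0.1, 0.9, 0.8]
μ, Σ = gp_posterior(X, y, collect(0:0.25:1))
```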
no code implementations • 28 May 2023 • Yongchao Huang, Yuhang He, Hong Ge
In this work, we introduce a novel framework that combines physics-based and machine learning methods to analyse acoustic signals.
1 code implementation • 25 May 2023 • Wenlin Chen, Hong Ge
This work examines the characteristic activation values of individual ReLU units in neural networks.
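For intuition, a ReLU unit max(w⋅x + b, 0) switches between its inactive and active regimes on the hyperplane {x : w⋅x + b = 0}; the sketch below locates that switching point for a 1-D input (my reading of the setup, not code from the paper).

```julia
# A ReLU unit is zero on one side of the hyperplane {x : w⋅x + b = 0}
# and linear on the other; this boundary characterises its behaviour.
relu(z) = max(z, 0.0)

w, b = 2.0, -1.0
boundary = -b / w                    # pre-activation zero-crossing: x = 0.5

unit(x) = relu(w * x + b)
@assert unit(boundary) == 0.0        # exactly at the switching point
@assert unit(boundary + 0.1) > 0.0   # active side
@assert unit(boundary - 0.1) == 0.0  # inactive side
```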
no code implementations • 22 Nov 2022 • Adrian Goldwaser, Hong Ge
However, these theoretical tools cannot fully explain finite networks: in contrast to infinite networks, the empirical kernel changes significantly during gradient-descent-based training.
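One way to see this empirically: the empirical tangent kernel Θ(x, x′) = ∇θf(x)ᵀ∇θf(x′) can be evaluated before and after a training step and compared. A toy sketch with a one-hidden-layer network; sizes and names are illustrative, not the paper's architecture.

```julia
using ForwardDiff, LinearAlgebra

# Tiny one-hidden-layer network with flattened parameter vector θ.
function f(θ, x)
    W1, b1 = reshape(θ[1:10], 10, 1), θ[11:20]
    w2, b2 = θ[21:30], θ[31]
    h = max.(W1 * [x] .+ b1, 0.0)    # ReLU hidden layer
    return dot(w2, h) + b2
end

# Empirical-kernel entry: inner product of parameter gradients at x and x′.
ntk(θ, x, x′) = dot(ForwardDiff.gradient(t -> f(t, x), θ),
                    ForwardDiff.gradient(t -> f(t, x′), θ))

θ = randn(31)
before = ntk(θ, 0.3, -0.7)
θ .-= 0.1 .* ForwardDiff.gradient(t -> (f(t, 0.3) - 1.0)^2, θ)  # one SGD step
after = ntk(θ, 0.3, -0.7)            # differs from `before` in a finite network
```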
no code implementations • 21 Nov 2022 • Shaohua Zhi, Yinghui Wang, Haonan Xiao, Ti Bai, Hong Ge, Bing Li, Chenyang Liu, Wen Li, Tian Li, Jing Cai
Four-dimensional magnetic resonance imaging (4D-MRI) is an emerging technique for tumor motion management in image-guided radiation therapy (IGRT).
1 code implementation • 14 Oct 2022 • Alexander Terenin, David R. Burt, Artem Artemev, Seth Flaxman, Mark van der Wilk, Carl Edward Rasmussen, Hong Ge
For low-dimensional tasks such as geospatial modeling, we propose an automated method for computing inducing points satisfying these conditions.
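The paper's construction is based on cover trees; purely as an illustration of the minimum-separation idea (not the authors' algorithm), a naive greedy pass keeps a candidate whenever it is at least r away from every point kept so far:

```julia
using LinearAlgebra

# Greedy selection: keep a point if it is at least r from all kept points.
# A naive O(n·m) stand-in for the paper's cover-tree construction.
function min_separation_points(X::Vector{<:AbstractVector}, r)
    Z = similar(X, 0)
    for x in X
        if all(norm(x - z) ≥ r for z in Z)
            push!(Z, x)
        end
    end
    return Z
end

X = [randn(2) for _ in 1:1000]       # e.g. 2-D geospatial coordinates
Z = min_separation_points(X, 0.5)    # inducing-point candidates
```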
2 code implementations • 7 Feb 2020 • Mohamed Tarek, Kai Xu, Martin Trapp, Hong Ge, Zoubin Ghahramani
Since DynamicPPL is a modular, stand-alone library, any probabilistic programming system written in Julia, such as Turing.jl, can use DynamicPPL to specify models and trace their model parameters.
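For context, a DynamicPPL-style model as written through Turing.jl's @model macro; the coin-flip model below is a minimal sketch of my own, not an example from the paper.

```julia
using Turing

# A DynamicPPL model: `~` statements declare random variables whose
# values are traced by the library.
@model function coinflip(y)
    p ~ Beta(1, 1)                 # prior on the heads probability
    for i in eachindex(y)
        y[i] ~ Bernoulli(p)        # likelihood for each observed flip
    end
end

chain = sample(coinflip([1, 0, 1, 1, 1]), NUTS(), 1_000)
mean(chain[:p])                    # posterior mean of p
```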
1 code implementation • AABI Symposium 2019 • Tor Erlend Fjelde, Kai Xu, Mohamed Tarek, Sharan Yalburgi, Hong Ge
Transforming one probability distribution to another is a powerful tool in Bayesian inference and machine learning.
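The paper's library, Bijectors.jl, represents such transformations as invertible maps with tractable log-Jacobians. A minimal usage sketch follows; exact names may differ slightly across library versions.

```julia
using Bijectors, Distributions

d = Beta(2, 2)                 # support (0, 1)
b = bijector(d)                # map from (0, 1) to the whole real line

x = rand(d)
y = b(x)                       # unconstrained value
x ≈ inverse(b)(y)              # invertibility

# Density correction via the log absolute Jacobian determinant:
logpdf(d, x) - logabsdetjac(b, x)   # log-density of y under the pushforward

td = transformed(d)            # the pushforward distribution itself
logpdf(td, y)                  # same quantity, computed by the library
```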
1 code implementation • AABI Symposium 2019 • Kai Xu, Hong Ge, Will Tebbutt, Mohamed Tarek, Martin Trapp, Zoubin Ghahramani
Stan's Hamiltonian Monte Carlo (HMC) has demonstrated remarkable sampling robustness and efficiency in a wide range of Bayesian inference problems, through carefully crafted adaptation schemes combined with the celebrated No-U-Turn Sampler (NUTS) algorithm.
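The resulting library, AdvancedHMC.jl, decomposes NUTS into interchangeable metric, integrator, trajectory, and adaptor components. Below is a sketch in the style of its documented API, targeting a standard-normal log-density; component names may vary across versions.

```julia
using AdvancedHMC, ForwardDiff

D = 5
ℓπ(θ) = -sum(abs2, θ) / 2              # log-density of a standard normal

metric      = DiagEuclideanMetric(D)
hamiltonian = Hamiltonian(metric, ℓπ, ForwardDiff)

θ0         = randn(D)
ϵ0         = find_good_stepsize(hamiltonian, θ0)
integrator = Leapfrog(ϵ0)

# NUTS-style kernel with multinomial sampling and generalised no-U-turn
# termination, plus Stan-style step-size and mass-matrix adaptation.
kernel  = HMCKernel(Trajectory{MultinomialTS}(integrator, GeneralisedNoUTurn()))
adaptor = StanHMCAdaptor(MassMatrixAdaptor(metric), StepSizeAdaptor(0.8, integrator))

samples, stats = sample(hamiltonian, kernel, θ0, 2_000, adaptor, 1_000)
```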
no code implementations • AABI Symposium 2019 • Philipp Gabler, Martin Trapp, Hong Ge, Franz Pernkopf
Many modern machine learning algorithms, such as automatic differentiation (AD) and versions of approximate Bayesian inference, can be understood as a particular case of message passing on some computation graph.
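To make that concrete, reverse-mode AD can be phrased as adjoint messages flowing backwards over the computation graph: each node forwards its incoming adjoint, weighted by the local partial derivative, to its inputs. A toy sketch of this view (my illustration, not the paper's formalism):

```julia
# Reverse-mode AD as message passing: each node forwards its incoming
# adjoint message, weighted by the local derivative, to its inputs.
mutable struct Node
    value::Float64
    adjoint::Float64
    parents::Vector{Tuple{Node,Float64}}   # (input node, local derivative)
end
Node(v) = Node(v, 0.0, Tuple{Node,Float64}[])

add(a::Node, b::Node) = Node(a.value + b.value, 0.0, [(a, 1.0), (b, 1.0)])
mul(a::Node, b::Node) = Node(a.value * b.value, 0.0, [(a, b.value), (b, a.value)])

function backward!(out::Node)
    queue = [(out, 1.0)]                   # (node, incoming message)
    while !isempty(queue)
        n, msg = pop!(queue)
        n.adjoint += msg                   # accumulate received messages
        for (p, d) in n.parents
            push!(queue, (p, msg * d))     # pass weighted message upstream
        end
    end
end

x, y = Node(2.0), Node(3.0)
z = add(mul(x, y), x)                      # z = x*y + x
backward!(z)
(x.adjoint, y.adjoint)                     # (y + 1, x) = (4.0, 2.0)
```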
1 code implementation • NeurIPS 2019 • Martin Trapp, Robert Peharz, Hong Ge, Franz Pernkopf, Zoubin Ghahramani
While parameter learning in SPNs is well developed, structure learning is less mature: even though there is a plethora of SPN structure learners, most of them are somewhat ad hoc and based on intuition rather than a clear learning principle.
no code implementations • NeurIPS 2015 • Nilesh Tripuraneni, Shixiang (Shane) Gu, Hong Ge, Zoubin Ghahramani
Infinite Hidden Markov Models (iHMMs) are an attractive, nonparametric generalization of the classical Hidden Markov Model; they can automatically infer the number of hidden states in the system.
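The ingredient that lets an iHMM entertain unboundedly many states is the stick-breaking (GEM) construction of Dirichlet-process weights; a truncated toy sketch (illustrative only, not the paper's sampler):

```julia
using Distributions

# Truncated stick-breaking (GEM) weights: break a unit stick into pieces,
# each a Beta(1, α) fraction of what remains. With concentration α, this
# is how an iHMM places prior mass over an unbounded state space.
function gem(α, K)
    v = rand(Beta(1, α), K)
    return v .* vcat(1.0, cumprod(1 .- v)[1:end-1])
end

w = gem(4.0, 20)     # weights over (a truncation of) infinitely many states
sum(w)               # < 1; the remainder covers all unrepresented states
```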
no code implementations • 16 Sep 2015 • Hong Ge, Yarin Gal, Zoubin Ghahramani
In this paper, we first review the theory of random fragmentation processes [Bertoin, 2006] and a number of existing methods for modelling trees, including the popular nested Chinese restaurant process (nCRP).
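For reference, the Chinese restaurant process underlying the nCRP can be simulated in a few lines; a toy sketch, with the concentration α chosen arbitrarily for illustration:

```julia
using Distributions

# Chinese restaurant process: customer i joins table k with probability
# proportional to its occupancy, or opens a new table with probability ∝ α.
function crp(n, α)
    counts, z = Int[], Int[]
    for i in 1:n
        p = vcat(counts, α)
        k = rand(Categorical(p ./ sum(p)))
        k > length(counts) ? push!(counts, 1) : (counts[k] += 1)
        push!(z, k)
    end
    return z
end

z = crp(100, 1.5)    # table assignments; the nCRP nests such draws per level
```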