no code implementations • 17 Apr 2016 • Arno Solin, Pasi Jylänki, Jaakko Kauramäki, Tom Heskes, Marcel A. J. van Gerven, Simo Särkkä
We apply the method to both simulated and empirical data, and demonstrate the efficiency and generality of our Bayesian source reconstruction approach, which subsumes various classical approaches in the literature.
2 code implementations • 16 Dec 2014 • Aki Vehtari, Andrew Gelman, Tuomas Sivula, Pasi Jylänki, Dustin Tran, Swupnil Sahai, Paul Blomstedt, John P. Cunningham, David Schiminovich, Christian Robert
A common divide-and-conquer approach for Bayesian computation with big data is to partition the data, perform local inference for each piece separately, and combine the results to obtain a global posterior approximation.
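A minimal sketch of the generic divide-and-conquer pattern described above, for a toy conjugate-Gaussian model: each partition is given a 1/K-powered prior so that the product of the K Gaussian subposteriors recovers the exact full-data posterior. This illustrates the partition–local-inference–combine idea only; the paper's own method uses expectation propagation rather than this naive product rule, and all numbers here are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 1.0                      # known noise standard deviation (assumed)
mu0, tau0 = 0.0, 1.0             # Gaussian prior N(mu0, tau0^2) on the mean
data = rng.normal(2.0, sigma, size=600)

K = 3
pieces = np.array_split(data, K)

# Each worker does conjugate inference with a 1/K-powered prior, so the
# product of the K Gaussian subposteriors equals the full posterior.
prec_sum, prec_mean_sum = 0.0, 0.0
for piece in pieces:
    prior_prec = (1.0 / tau0**2) / K
    lik_prec = len(piece) / sigma**2
    post_prec = prior_prec + lik_prec
    post_mean = (prior_prec * mu0 + lik_prec * piece.mean()) / post_prec
    # combine Gaussians by summing precisions and precision-weighted means
    prec_sum += post_prec
    prec_mean_sum += post_prec * post_mean

global_mean = prec_mean_sum / prec_sum
global_var = 1.0 / prec_sum

# Exact full-data posterior for comparison
full_prec = 1.0 / tau0**2 + len(data) / sigma**2
full_mean = (mu0 / tau0**2 + data.sum() / sigma**2) / full_prec
```

The combination step is exact here only because every subposterior is Gaussian and conjugate; with non-Gaussian local approximations the combination itself must be approximated, which is where the iterative EP machinery of the paper comes in.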
no code implementations • 22 Apr 2014 • Ville Tolvanen, Pasi Jylänki, Aki Vehtari
This paper presents a novel approach for approximate integration over the uncertainty of noise and signal variances in Gaussian process (GP) regression.
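One simple way to see what "integration over the uncertainty of noise and signal variances" means in practice is grid-based marginalization: evaluate the GP marginal likelihood on a grid of (signal variance, noise variance) pairs and weight each predictive mean by the normalized grid weights. This is a crude stand-in for the paper's approach; the kernel, grid values, and data below are illustrative assumptions.

```python
import numpy as np

def sq_exp_kernel(x1, x2, s2, ell=1.0):
    # squared-exponential covariance with signal variance s2 (lengthscale fixed)
    d = x1[:, None] - x2[None, :]
    return s2 * np.exp(-0.5 * (d / ell) ** 2)

rng = np.random.default_rng(1)
x = np.linspace(0.0, 5.0, 30)
y = np.sin(x) + rng.normal(0.0, 0.2, x.size)
xs = np.array([2.5])                       # single test input

# Grid over signal variance s2 and noise variance n2 (assumed values).
grid = [(s2, n2) for s2 in (0.5, 1.0, 2.0) for n2 in (0.01, 0.04, 0.1)]
log_ml, means = [], []
for s2, n2 in grid:
    K = sq_exp_kernel(x, x, s2) + n2 * np.eye(x.size)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    # log marginal likelihood of this hyperparameter setting
    log_ml.append(-0.5 * y @ alpha - np.log(np.diag(L)).sum()
                  - 0.5 * x.size * np.log(2 * np.pi))
    means.append(sq_exp_kernel(xs, x, s2) @ alpha)

# Normalize weights in log space and average the predictions.
w = np.exp(log_ml - np.max(log_ml))
w /= w.sum()
pred = sum(wi * mi for wi, mi in zip(w, means))
```

Averaging over the grid, rather than plugging in a single optimized (s2, n2) pair, propagates hyperparameter uncertainty into the prediction, which is the effect the paper targets with more efficient machinery.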
no code implementations • 27 Mar 2013 • Pasi Jylänki, Aapo Nummenmaa, Aki Vehtari
Comparisons are made to two alternative models with ARD priors: a Gaussian process with a NN covariance function and marginal maximum a posteriori estimates of the relevance parameters, and a NN with Markov chain Monte Carlo integration over all the unknown model parameters.
1 code implementation • 25 Jun 2012 • Jarno Vanhatalo, Jaakko Riihimäki, Jouni Hartikainen, Pasi Jylänki, Ville Tolvanen, Aki Vehtari
The prior over functions is defined implicitly by the mean and covariance function, which determine the smoothness and variability of the function.
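The role of the covariance function in determining smoothness and variability can be seen by drawing prior samples directly: a finite grid of inputs turns the GP prior into a multivariate Gaussian whose covariance matrix comes from the kernel. A minimal sketch with a squared-exponential kernel at two assumed lengthscales:

```python
import numpy as np

def sq_exp(x, ell):
    # squared-exponential covariance: nearby inputs are strongly correlated
    d = x[:, None] - x[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

x = np.linspace(0.0, 1.0, 100)
rng = np.random.default_rng(0)

samples = {}
for ell in (0.05, 0.3):
    K = sq_exp(x, ell) + 1e-9 * np.eye(x.size)  # jitter for numerical stability
    samples[ell] = rng.multivariate_normal(np.zeros(x.size), K)

# A shorter lengthscale yields rougher, faster-varying sample paths.
roughness = {ell: np.std(np.diff(f)) for ell, f in samples.items()}
```

Here the zero mean function and unit signal variance are arbitrary choices; in GPstuff these, along with the kernel family, are the modeling decisions that define the prior over functions.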
no code implementations • NeurIPS 2009 • Jarno Vanhatalo, Pasi Jylänki, Aki Vehtari
In this work, we discuss the properties of a Gaussian process regression model with the Student-t likelihood and utilize the Laplace approximation for approximate inference.
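The Laplace idea, finding the posterior mode and using the local curvature as a Gaussian variance, can be illustrated on a one-dimensional analogue of the GP setting: a single location parameter with a Gaussian prior and a Student-t likelihood. The data, degrees of freedom, and scale below are invented for illustration; note how the heavy-tailed likelihood keeps the mode near zero despite the outlier.

```python
import numpy as np
from scipy import optimize

# Observations with one gross outlier (hypothetical data).
y = np.array([-0.1, 0.2, 0.0, 0.1, 10.0])
nu, s2, tau2 = 4.0, 0.25, 4.0   # t dof, t scale^2, prior variance (assumed)

def neg_log_post(f):
    # Student-t log-likelihood terms plus Gaussian prior, up to constants
    lik = 0.5 * (nu + 1) * np.log(1.0 + (y - f) ** 2 / (nu * s2)).sum()
    return lik + 0.5 * f**2 / tau2

# Mode of the posterior (the heavy tails downweight the outlier).
f_hat = optimize.minimize_scalar(neg_log_post).x

# Laplace approximation: variance = inverse curvature at the mode,
# estimated here by a central finite difference.
h = 1e-4
curv = (neg_log_post(f_hat + h) - 2 * neg_log_post(f_hat)
        + neg_log_post(f_hat - h)) / h**2
laplace_var = 1.0 / curv
```

In the GP case of the paper, f becomes a vector of latent values, the prior is the GP covariance, and the mode search needs care because the Student-t likelihood is not log-concave; this sketch only shows the basic approximation.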