Gaussian Processes
573 papers with code • 1 benchmark • 5 datasets
Gaussian processes (GPs) form a powerful framework for several machine learning tasks, such as regression, classification, and inference. Given a finite set of input-output training pairs generated by a fixed (but possibly unknown) function, the framework models the unknown function as a stochastic process such that the training outputs are a finite collection of jointly Gaussian random variables, whose properties can then be used to infer the statistics (the mean and variance) of the function at test inputs.
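The regression case described above can be sketched in a few lines of NumPy: condition a zero-mean GP with an RBF kernel on noisy training data and read off the posterior mean and variance at test inputs. This is a minimal illustration, not a reference implementation; the kernel choice, hyperparameters, and toy data are all assumptions for the example.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    # Squared-exponential (RBF) covariance between two sets of inputs.
    sqdist = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return variance * np.exp(-0.5 * sqdist / lengthscale**2)

def gp_posterior(X_train, y_train, X_test, noise=1e-2):
    # Posterior mean and pointwise variance of a zero-mean GP at test inputs.
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_test)
    K_ss = rbf_kernel(X_test, X_test)
    L = np.linalg.cholesky(K)  # stable solve via Cholesky factorization
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss) - np.sum(v**2, axis=0)  # predictive variance per test point
    return mean, var

# Toy data: noisy samples of sin(x) on [0, 5]
X = np.linspace(0, 5, 20)[:, None]
y = np.sin(X).ravel() + 0.1 * np.random.default_rng(0).normal(size=20)
X_star = np.linspace(0, 5, 100)[:, None]
mean, var = gp_posterior(X, y, X_star)
```

Note that the predictive variance shrinks near the training inputs and grows away from them, which is the uncertainty quantification that distinguishes GPs from a plain curve fit.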
Libraries
Use these libraries to find Gaussian Processes models and implementations.
Most implemented papers
Deep Convolutional Networks as shallow Gaussian Processes
For a CNN, the equivalent kernel can be computed exactly and, unlike "deep kernels", has very few parameters: only the hyperparameters of the original CNN.
GaussianProcesses.jl: A Nonparametric Bayes package for the Julia Language
Gaussian processes are a class of flexible nonparametric Bayesian tools that are widely used across the sciences, and in industry, to model complex data sources.
Functional Variational Bayesian Neural Networks
We introduce functional variational Bayesian neural networks (fBNNs), which maximize an Evidence Lower Bound (ELBO) defined directly on stochastic processes, i.e. distributions over functions.
Exact Gaussian Processes on a Million Data Points
Gaussian processes (GPs) are flexible non-parametric models, with a capacity that grows with the available data.
Deep Bayesian Optimization on Attributed Graphs
Attributed graphs, which contain rich contextual features beyond just network structure, are ubiquitous and have been observed to benefit various network analytics applications.
Bayesian Meta-Learning for the Few-Shot Setting via Deep Kernels
Recently, different machine learning methods have been introduced to tackle the challenging few-shot learning scenario, that is, learning from a small labeled dataset related to a specific task.
PACOH: Bayes-Optimal Meta-Learning with PAC-Guarantees
Meta-learning can successfully acquire useful inductive biases from data.
How Good are Low-Rank Approximations in Gaussian Process Regression?
In particular, we bound the Kullback-Leibler divergence between an exact GP and one whose kernel is replaced by one of the aforementioned low-rank approximations, as well as between their corresponding predictive densities. We also bound the error between the predictive mean vectors, and between the predictive covariance matrices, computed using the exact versus the approximate GP.
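A common low-rank kernel approximation of the kind studied in such work is the Nystrom method: approximate the full n x n kernel matrix from its cross-covariance with a small set of inducing points. The sketch below is illustrative only; the inducing-point count, jitter, and toy data are assumptions made for this example.

```python
import numpy as np

def rbf(X1, X2, ls=1.0):
    # Squared-exponential covariance between two sets of inputs.
    d = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return np.exp(-0.5 * d / ls**2)

rng = np.random.default_rng(0)
X = rng.uniform(0, 5, size=(200, 1))
K = rbf(X, X)  # exact 200 x 200 kernel matrix

# Nystrom approximation: K ~= K_nm K_mm^{-1} K_nm^T from m inducing points
m = 30
Z = X[rng.choice(len(X), m, replace=False)]
K_nm = rbf(X, Z)
K_mm = rbf(Z, Z) + 1e-8 * np.eye(m)  # jitter for numerical stability
K_approx = K_nm @ np.linalg.solve(K_mm, K_nm.T)

# Relative Frobenius error of the rank-m approximation
err = np.linalg.norm(K - K_approx) / np.linalg.norm(K)
```

Because the RBF kernel's spectrum decays quickly, a modest number of inducing points already reproduces the exact kernel matrix closely, which is why the approximation error (and hence bounds like the ones above) can be made small at a fraction of the exact GP's cost.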
Bayesian Deep Ensembles via the Neural Tangent Kernel
We explore the link between deep ensembles and Gaussian processes (GPs) through the lens of the Neural Tangent Kernel (NTK): a recent development in understanding the training dynamics of wide neural networks (NNs).
An Intuitive Tutorial to Gaussian Process Regression
This tutorial is accessible to a broad audience, including those new to machine learning, ensuring a clear understanding of GPR fundamentals.