Gaussian Processes

573 papers with code • 1 benchmark • 5 datasets

Gaussian processes are a powerful framework for machine learning tasks such as regression, classification, and inference. Given a finite set of input-output training data generated by a fixed (but possibly unknown) function, the framework models the unknown function as a stochastic process such that the training outputs are a finite number of jointly Gaussian random variables, whose properties can then be used to infer the statistics (the mean and variance) of the function at test inputs.

Source: Sequential Randomized Matrix Factorization for Gaussian Processes: Efficient Predictions and Hyper-parameter Optimization
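The posterior inference described above can be sketched in a few lines of NumPy. This is a minimal illustration, assuming a squared-exponential kernel with unit hyperparameters and a small observation-noise level; these choices are illustrative, not prescribed by the source.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    sq_dists = (X1[:, None] - X2[None, :]) ** 2
    return variance * np.exp(-0.5 * sq_dists / lengthscale ** 2)

def gp_posterior(X_train, y_train, X_test, noise=1e-2):
    """Exact GP regression: posterior mean and variance at the test inputs."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_test)
    K_ss = rbf_kernel(X_test, X_test)
    # Solve K alpha = y via Cholesky rather than forming the inverse.
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss) - np.sum(v ** 2, axis=0)
    return mean, var

# Noisy observations of sin(x); far from the data the posterior
# reverts to the prior (mean 0, variance 1).
X_train = np.array([0.0, 1.0, 2.0, 3.0])
y_train = np.sin(X_train)
mean, var = gp_posterior(X_train, y_train, np.array([1.5, 10.0]))
```

Near the training data the predictive variance is small; at a test input far from any observation it approaches the prior variance, which is exactly the behavior described in the paragraph above.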

Most implemented papers

Deep Convolutional Networks as shallow Gaussian Processes

rhaps0dy/convnets-as-gps ICLR 2019

For a CNN, the equivalent kernel can be computed exactly and, unlike "deep kernels", has very few parameters: only the hyperparameters of the original CNN.

GaussianProcesses.jl: A Nonparametric Bayes package for the Julia Language

STOR-i/GaussianProcesses.jl 21 Dec 2018

Gaussian processes are a class of flexible nonparametric Bayesian tools that are widely used across the sciences, and in industry, to model complex data sources.

Functional Variational Bayesian Neural Networks

ssydasheng/FBNN ICLR 2019

We introduce functional variational Bayesian neural networks (fBNNs), which maximize an Evidence Lower BOund (ELBO) defined directly on stochastic processes, i.e., distributions over functions.

Exact Gaussian Processes on a Million Data Points

cornellius-gp/gpytorch NeurIPS 2019

Gaussian processes (GPs) are flexible non-parametric models, with a capacity that grows with the available data.

Deep Bayesian Optimization on Attributed Graphs

csjtx1021/DGBO 31 May 2019

Attributed graphs, which contain rich contextual features beyond just network structure, are ubiquitous and have been observed to benefit various network analytics applications.

Bayesian Meta-Learning for the Few-Shot Setting via Deep Kernels

BayesWatch/deep-kernel-transfer NeurIPS 2020

Recently, different machine learning methods have been introduced to tackle the challenging few-shot learning scenario, that is, learning from a small labeled dataset related to a specific task.

How Good are Low-Rank Approximations in Gaussian Process Regression?

aresPanos/DMGP_regression 3 Apr 2020

In particular, we bound the Kullback-Leibler divergence between an exact GP and one resulting from one of the aforementioned low-rank approximations to its kernel, as well as between their corresponding predictive densities; we also bound the error between the predictive mean vectors and between the predictive covariance matrices computed using the exact versus the approximate GP.
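As a rough illustration of the kind of approximation whose error is being bounded, here is a Nyström low-rank kernel approximation sketched in NumPy. The evenly spaced inducing points and the jitter term are simplifying assumptions for this sketch, not the paper's method.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    sq = (X1[:, None] - X2[None, :]) ** 2
    return np.exp(-0.5 * sq / lengthscale ** 2)

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0, 10, size=200))
K = rbf_kernel(X, X)                      # exact n x n kernel matrix

# Nystrom approximation K ~= K_nz K_zz^{-1} K_nz^T built from m inducing
# points (here an evenly spaced grid, purely for illustration).
Z = np.linspace(0, 10, 20)
K_nz = rbf_kernel(X, Z)                   # n x m cross-covariance
K_zz = rbf_kernel(Z, Z)                   # m x m inducing covariance
K_approx = K_nz @ np.linalg.solve(K_zz + 1e-8 * np.eye(len(Z)), K_nz.T)

# Relative Frobenius error of the low-rank approximation.
rel_err = np.linalg.norm(K - K_approx) / np.linalg.norm(K)
```

Because the RBF kernel's eigenvalues decay quickly, a modest number of inducing points already reproduces the full kernel matrix closely; the paper above quantifies how such kernel-level errors propagate to the predictive mean and covariance.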

Bayesian Deep Ensembles via the Neural Tangent Kernel

bobby-he/bayesian-ntk NeurIPS 2020

We explore the link between deep ensembles and Gaussian processes (GPs) through the lens of the Neural Tangent Kernel (NTK): a recent development in understanding the training dynamics of wide neural networks (NNs).

An Intuitive Tutorial to Gaussian Process Regression

jwangjie/gaussian-process-regression-tutorial 22 Sep 2020

This tutorial is accessible to a broad audience, including those new to machine learning, ensuring a clear understanding of GPR fundamentals.