Non-separable Non-stationary random fields

We describe a framework for constructing non-separable non-stationary random fields based on an infinite mixture of convolved stochastic processes. When the mixing process is stationary and the convolution function is non-stationary, we arrive at expressive kernels that are available in closed form. When the mixing is non-stationary and the convolution function is stationary, the resulting random fields exhibit varying degrees of non-separability that better preserve local structure. These kernels have natural interpretations through corresponding stochastic differential equations (SDEs) and are demonstrated on a range of synthetic benchmarks and spatio-temporal applications in geostatistics and machine learning. We show how a single Gaussian process (GP) with these kernels can computationally and statistically outperform both separable kernels and existing non-stationary non-separable approaches such as treed GPs and deep GP constructions.
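As a rough illustration of the process-convolution idea summarized above (not the authors' exact construction), the sketch below builds a non-separable, non-stationary covariance by convolving white noise with an input-dependent, space-time-coupled smoothing filter and approximating the induced kernel k(x, x') = ∫ K(x, u) K(x', u) du by quadrature. All function names, grids, and parameter values are illustrative assumptions.

```python
# Hedged sketch of a process-convolution covariance (white-noise mixing),
# not the paper's exact framework:
#   f(x) = \int K(x, u) dW(u)   =>   k(x, x') = \int K(x, u) K(x', u) du.
# The integral is approximated by quadrature over a grid of latent locations u.
import numpy as np


def smoothing_kernel(x, U, ell_scale=0.3, coupling=0.6):
    """Anisotropic Gaussian filter K(x, u): input-dependent length-scale
    (non-stationarity) plus a space-time cross term (non-separability).
    x: (2,) space-time input, U: (m, 2) grid of latent locations u."""
    # Length-scale grows with the spatial coordinate -> non-stationary.
    ell = ell_scale * (1.0 + 0.5 * np.abs(x[0]))
    # Non-diagonal precision couples the space and time axes -> non-separable.
    A = np.array([[1.0, coupling], [coupling, 1.0]]) / ell**2
    d = U - x                                   # offsets u - x, shape (m, 2)
    quad = np.einsum('mi,ij,mj->m', d, A, d)    # (u - x)^T A (u - x)
    return np.exp(-0.5 * quad)


def convolution_covariance(X1, X2, U, du):
    """k(x, x') ~= du * sum_j K(x, u_j) K(x', u_j) for white-noise mixing."""
    K1 = np.stack([smoothing_kernel(x, U) for x in X1])   # (n1, m)
    K2 = np.stack([smoothing_kernel(x, U) for x in X2])   # (n2, m)
    return du * K1 @ K2.T


# Quadrature grid over the latent space u = (space, time).
g = np.linspace(-3, 3, 60)
U = np.array(np.meshgrid(g, g)).reshape(2, -1).T           # (3600, 2)
du = (g[1] - g[0]) ** 2

# Inputs: 25 space-time locations on a small grid.
s = np.linspace(-1, 1, 5)
X = np.array(np.meshgrid(s, s)).reshape(2, -1).T            # (25, 2)

K = convolution_covariance(X, X, U, du)
K += 1e-8 * np.eye(len(X))                                  # jitter for stability

# Draw one sample field from the induced GP prior.
rng = np.random.default_rng(0)
sample = np.linalg.cholesky(K) @ rng.standard_normal(len(X))
print("covariance shape:", K.shape, "| first sample values:", sample[:5])
```

Under these assumptions, the non-diagonal precision matrix is what breaks separability between space and time, while the input-dependent length-scale is what breaks stationarity; replacing the white-noise mixing with a stationary or non-stationary mixing process corresponds, loosely, to the two regimes contrasted in the abstract.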

ICML 2020
No code implementations yet.
