Search Results for author: Ingmar Schuster

Found 14 papers, 5 papers with code

Autoregressive Denoising Diffusion Models for Multivariate Probabilistic Time Series Forecasting

1 code implementation • 28 Jan 2021 • Kashif Rasul, Calvin Seward, Ingmar Schuster, Roland Vollgraf

In this work, we propose TimeGrad, an autoregressive model for multivariate probabilistic time series forecasting which samples from the data distribution at each time step by estimating its gradient.

Tasks: Multivariate Time Series Forecasting, Probabilistic Time Series Forecasting, +1
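
The core idea, paraphrased from the abstract: each new multivariate observation is drawn by running a denoising diffusion sampler whose noise-prediction network is conditioned on a recurrent state summarising the history. The sketch below (NumPy) illustrates that sampling loop only; denoise_fn, rnn_step, the noise schedule and all dimensions are illustrative stand-ins, not the TimeGrad architecture or its trained weights.

```python
# Minimal sketch of the TimeGrad idea: sample each observation with a denoising
# diffusion loop conditioned on an RNN state.  All components are placeholders.
import numpy as np

D, T, N = 4, 10, 50                      # target dim, forecast horizon, diffusion steps
betas = np.linspace(1e-4, 0.1, N)        # noise schedule (assumed, not from the paper)
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)

def denoise_fn(x, h, n):
    """Stand-in for a trained noise-prediction network eps_theta(x, h, n)."""
    return 0.1 * x + 0.01 * h[:D]        # placeholder; the real network is learned

def rnn_step(h, x):
    """Stand-in recurrent update carrying past observations into the state h."""
    return np.tanh(0.5 * h + 0.5 * np.resize(x, h.shape))

h = np.zeros(8)                          # hidden state summarising the history
rng = np.random.default_rng(0)
forecast = []
for t in range(T):
    x = rng.standard_normal(D)           # start the reverse diffusion from noise
    for n in reversed(range(N)):         # ancestral DDPM sampling, step N-1 .. 0
        eps = denoise_fn(x, h, n)
        x = (x - betas[n] / np.sqrt(1 - alpha_bars[n]) * eps) / np.sqrt(alphas[n])
        if n > 0:
            x += np.sqrt(betas[n]) * rng.standard_normal(D)
    forecast.append(x)
    h = rnn_step(h, x)                   # condition the next time step on the new sample
print(np.stack(forecast).shape)          # (T, D)
```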

Feature space approximation for kernel-based supervised learning

1 code implementation • 25 Nov 2020 • Patrick Gelß, Stefan Klus, Ingmar Schuster, Christof Schütte

We propose a method for the approximation of high- or even infinite-dimensional feature vectors, which play an important role in supervised learning.

Tasks: Regression, Time Series, +1
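
To make the notion of approximating an infinite-dimensional feature vector concrete, here is a hedged Nyström-style sketch: the RBF feature map is replaced by a finite-dimensional map built from a few landmark points. This is a generic technique shown only for illustration; it is not the specific construction proposed in the paper.

```python
# Nyström sketch: a finite-dimensional approximation of the RBF feature map.
import numpy as np

def rbf(X, Y, sigma=1.0):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))            # training inputs
landmarks = X[rng.choice(len(X), 20, replace=False)]

K_mm = rbf(landmarks, landmarks)
evals, evecs = np.linalg.eigh(K_mm + 1e-8 * np.eye(len(K_mm)))
W = evecs / np.sqrt(evals)                   # acts like K_mm^{-1/2} (up to a rotation)

def features(Z):
    """Finite-dimensional feature map with <phi(x), phi(y)> ~ k(x, y)."""
    return rbf(Z, landmarks) @ W

approx = features(X) @ features(X).T         # Nyström approximation of the Gram matrix
print(np.abs(approx - rbf(X, X)).max())
```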

Multivariate Probabilistic Time Series Forecasting via Conditioned Normalizing Flows

1 code implementation • ICLR 2021 • Kashif Rasul, Abdul-Saboor Sheikh, Ingmar Schuster, Urs Bergmann, Roland Vollgraf

In this work we model the multivariate temporal dynamics of time series via an autoregressive deep learning model, where the data distribution is represented by a conditioned normalizing flow.

Tasks: Decision Making, Multivariate Time Series Forecasting, +3
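
A minimal sketch of the sampling side of such a model, under the assumption of a single affine coupling layer whose scale and shift depend on a recurrent state h; the placeholder weights and recurrence below are illustrative, not the paper's architecture.

```python
# Sampling from a toy conditioned normalizing flow: an affine coupling layer
# whose parameters are functions of a recurrent state h (placeholder networks).
import numpy as np

rng = np.random.default_rng(0)
D, H = 4, 8
W_s = 0.1 * rng.standard_normal((D // 2, H))
W_t = 0.1 * rng.standard_normal((D // 2, H))

def coupling_forward(z, h):
    """Map base noise z to an observation x, conditioned on the state h."""
    z1, z2 = z[: D // 2], z[D // 2:]
    scale, shift = np.tanh(W_s @ h), W_t @ h   # conditioning on h (placeholder nets)
    x2 = z2 * np.exp(scale) + shift            # affine coupling, invertible in z2
    return np.concatenate([z1, x2])

h = np.zeros(H)
samples = []
for t in range(10):                            # autoregressive rollout
    x = coupling_forward(rng.standard_normal(D), h)
    samples.append(x)
    h = np.tanh(0.5 * h + 0.5 * np.resize(x, H))   # stand-in recurrent update
print(np.stack(samples).shape)                 # (10, 4)
```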

A Rigorous Theory of Conditional Mean Embeddings

no code implementations • 2 Dec 2019 • Ilja Klebanov, Ingmar Schuster, T. J. Sullivan

Conditional mean embeddings (CMEs) have proven themselves to be a powerful tool in many machine learning applications.
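
For context, a common empirical CME estimator evaluates E[g(Y) | X = x] as g(Y_train)^T (K_X + n*lam*I)^{-1} k_X(x); with g(y) = y this reduces to kernel ridge regression. The sketch below uses that standard estimator with an illustrative kernel and regularisation; it is background for the topic, not material from the paper itself.

```python
# Empirical conditional mean embedding estimate of E[Y | X = x] (kernel ridge form).
import numpy as np

def rbf(A, B, sigma=0.5):
    return np.exp(-((A[:, None] - B[None, :]) ** 2) / (2 * sigma**2))

rng = np.random.default_rng(0)
n = 300
X = rng.uniform(-2, 2, n)
Y = np.sin(X) + 0.1 * rng.standard_normal(n)      # true conditional mean: sin(x)

lam = 1e-3
alpha = np.linalg.solve(rbf(X, X) + n * lam * np.eye(n), Y)   # (K_X + n*lam*I)^{-1} Y

x_test = np.array([0.0, 1.0])
pred = rbf(x_test, X) @ alpha                     # CME applied to g(y) = y
print(pred, np.sin(x_test))                       # close to the true conditional mean
```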

Set Flow: A Permutation Invariant Normalizing Flow

no code implementations • 6 Sep 2019 • Kashif Rasul, Ingmar Schuster, Roland Vollgraf, Urs Bergmann

We present a generative model that is defined on finite sets of exchangeable, potentially high dimensional, data.
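
One generic way to obtain permutation invariance, shown below purely as an illustration of the property (not the Set Flow architecture), is to make every per-element transformation depend only on an order-independent pooled summary of the set; the resulting density then does not change when the elements are permuted.

```python
# Toy permutation-invariant density over sets: a fixed equivariant linear map
# followed by an i.i.d. Gaussian base density, with the Jacobian term included.
import numpy as np

rng = np.random.default_rng(0)

def log_density(set_x):
    """Log-density of a set of D-dim elements, invariant to element order."""
    n, d = set_x.shape
    context = set_x.mean(axis=0)                 # pooled summary: order-independent
    z = set_x - 0.5 * context                    # equivariant linear map (a trivial "flow")
    log_base = -0.5 * (z ** 2).sum() - 0.5 * z.size * np.log(2 * np.pi)
    return log_base + d * np.log(0.5)            # log|det| of the map keeps it normalised

S = rng.standard_normal((5, 3))                  # a set of 5 elements in R^3
print(np.isclose(log_density(S), log_density(S[::-1])))   # True: order does not matter
```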

Kernel Conditional Density Operators

no code implementations • 27 May 2019 • Ingmar Schuster, Mattes Mollenhauer, Stefan Klus, Krikamol Muandet

The proposed model is based on a novel approach to the reconstruction of probability densities from their kernel mean embeddings by drawing connections to estimation of Radon-Nikodym derivatives in the reproducing kernel Hilbert space (RKHS).

Tasks: Density Estimation, Gaussian Processes
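
As background for the embedding-to-density direction, note that when the kernel is itself a normalised density, pointwise evaluation of the empirical kernel mean embedding already yields a kernel density estimate. The sketch below shows only this elementary special case; the operator-based construction in the paper (via Radon-Nikodym derivatives with respect to a reference measure) is more general.

```python
# Evaluating an empirical kernel mean embedding with a Gaussian density kernel
# recovers a kernel density estimate.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, 500)                    # samples from the unknown density
sigma = 0.3

def mean_embedding(x):
    """Empirical kernel mean embedding evaluated at x."""
    k = np.exp(-(x - X) ** 2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
    return k.mean()

grid = np.linspace(-3, 3, 7)
est = np.array([mean_embedding(x) for x in grid])
true = np.exp(-grid**2 / 2) / np.sqrt(2 * np.pi)
print(np.round(est, 3), np.round(true, 3))       # estimate vs. the N(0, 1) density
```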

A kernel-based approach to molecular conformation analysis

no code implementations • 28 Sep 2018 • Stefan Klus, Andreas Bittracher, Ingmar Schuster, Christof Schütte

We present a novel machine learning approach to understanding conformation dynamics of biomolecules.

Tasks: Machine Learning

Singular Value Decomposition of Operators on Reproducing Kernel Hilbert Spaces

no code implementations • 24 Jul 2018 • Mattes Mollenhauer, Ingmar Schuster, Stefan Klus, Christof Schütte

Reproducing kernel Hilbert spaces (RKHSs) play an important role in many statistics and machine learning applications ranging from support vector machines to Gaussian processes and kernel embeddings of distributions.

Tasks: Gaussian Processes

Markov Chain Importance Sampling -- a highly efficient estimator for MCMC

no code implementations • 18 May 2018 • Ingmar Schuster, Ilja Klebanov

As a by-product it enables estimating the normalizing constant, an important quantity in Bayesian machine learning and statistics.

Tasks: Machine Learning
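
To make the quantity concrete: for an unnormalised density p_tilde and a tractable proposal q, the normalizing constant is Z = E_q[p_tilde(X) / q(X)], which plain importance sampling estimates by a sample average. The sketch below shows that generic estimator only, not the Markov chain importance sampling estimator developed in the paper.

```python
# Generic importance-sampling estimate of a normalizing constant.
import numpy as np

rng = np.random.default_rng(0)

def p_tilde(x):                       # unnormalised target: N(1, 0.5^2) up to Z
    return np.exp(-(x - 1.0) ** 2 / (2 * 0.5**2))

# proposal q = N(0, 2^2): easy to sample from and to evaluate
xs = rng.normal(0.0, 2.0, 100_000)
q = np.exp(-xs**2 / (2 * 2.0**2)) / np.sqrt(2 * np.pi * 2.0**2)

Z_hat = np.mean(p_tilde(xs) / q)
print(Z_hat, np.sqrt(2 * np.pi * 0.5**2))   # estimate vs. true Z ~ 1.2533
```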

Analyzing high-dimensional time-series data using kernel transfer operator eigenfunctions

no code implementations • 16 May 2018 • Stefan Klus, Sebastian Peitz, Ingmar Schuster

Kernel transfer operators, which can be regarded as approximations of transfer operators such as the Perron-Frobenius or Koopman operator in reproducing kernel Hilbert spaces, are defined in terms of covariance and cross-covariance operators and have been shown to be closely related to the conditional mean embedding framework developed by the machine learning community.

Tasks: Time Series, Time Series Analysis, +1
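
A rough kernel-EDMD-style sketch of the general workflow: build Gram matrices from time-lagged data, form a regularised empirical operator matrix, and read off its eigenvalues. The kernel, bandwidth and regularisation below are illustrative assumptions, and the construction is the generic one rather than necessarily the exact estimator of the paper.

```python
# Empirical transfer-operator eigenvalues from Gram matrices of time-lagged data.
import numpy as np

rng = np.random.default_rng(0)
T = 500
x = np.zeros(T)
for t in range(T - 1):                        # toy linear dynamics: x' = 0.9 x + small noise
    x[t + 1] = 0.9 * x[t] + 0.1 * rng.standard_normal()
X, Y = x[:-1], x[1:]                          # time-lagged pairs (state, successor)

def gauss(a, b, sigma=0.25):
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * sigma**2))

G_XX = gauss(X, X)                            # Gram matrix of current states
G_YX = gauss(Y, X)                            # cross Gram matrix with successor states
eps = 1e-4 * len(X)                           # Tikhonov regularisation (illustrative)

M = np.linalg.solve(G_XX + eps * np.eye(len(X)), G_YX)   # empirical operator matrix
lams = np.sort(np.abs(np.linalg.eigvals(M)))[::-1]
print(lams[:3])                               # expect values near 1 and 0.9 for this toy system
```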

Eigendecompositions of Transfer Operators in Reproducing Kernel Hilbert Spaces

1 code implementation • 5 Dec 2017 • Stefan Klus, Ingmar Schuster, Krikamol Muandet

Transfer operators such as the Perron-Frobenius or Koopman operator play an important role in the global analysis of complex dynamical systems.

Kernel Sequential Monte Carlo

1 code implementation • 11 Oct 2015 • Ingmar Schuster, Heiko Strathmann, Brooks Paige, Dino Sejdinovic

As KSMC does not require access to target gradients, it is particularly applicable on targets whose gradients are unknown or prohibitively expensive.
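
For orientation, the sketch below is a plain gradient-free SMC sampler (tempering from a Gaussian prior to an unnormalised target with random-walk Metropolis moves), using only density evaluations. KSMC additionally adapts its proposals with kernel-based statistics of the particle population, which this simplified sketch deliberately omits.

```python
# Gradient-free tempered SMC with multinomial resampling and random-walk MH moves.
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):                                 # unnormalised 2D target; no gradients used
    return -0.5 * (x[:, 0] ** 2 + (x[:, 1] - 0.5 * x[:, 0] ** 2) ** 2)

def log_prior(x):                                  # N(0, 3^2 I) starting distribution
    return -0.5 * (x ** 2).sum(axis=1) / 9.0

N, D = 500, 2
x = 3.0 * rng.standard_normal((N, D))
betas = np.linspace(0.0, 1.0, 11)                  # tempering schedule: prior -> target

for b0, b1 in zip(betas[:-1], betas[1:]):
    logw = (b1 - b0) * (log_target(x) - log_prior(x))   # incremental importance weights
    w = np.exp(logw - logw.max()); w /= w.sum()
    x = x[rng.choice(N, N, p=w)]                   # multinomial resampling
    for _ in range(3):                             # gradient-free random-walk MH moves
        prop = x + 0.5 * rng.standard_normal((N, D))
        logr = (b1 * log_target(prop) + (1 - b1) * log_prior(prop)
                - b1 * log_target(x) - (1 - b1) * log_prior(x))
        accept = np.log(rng.uniform(size=N)) < logr
        x[accept] = prop[accept]
print(x.mean(axis=0), x.std(axis=0))               # particle summary of the target
```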

Gradient Importance Sampling

no code implementations • 21 Jul 2015 • Ingmar Schuster

Adaptive Monte Carlo schemes developed over the last years usually seek to ensure ergodicity of the sampling process in line with MCMC tradition.
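
As a hedged illustration of the general idea suggested by the title, the sketch below uses target gradients to shift the means of an importance-sampling proposal (a Langevin-style drift) and then self-normalises the weights; this is a generic construction for intuition, not necessarily the scheme proposed in the paper.

```python
# Self-normalised importance sampling with gradient-informed proposal means.
import numpy as np

rng = np.random.default_rng(0)
mu_true, sd_true = 2.0, 0.7

def log_p(x):        return -0.5 * (x - mu_true) ** 2 / sd_true**2   # unnormalised target
def grad_log_p(x):   return -(x - mu_true) / sd_true**2

eps = 0.5
centers = rng.normal(0.0, 3.0, 2000)                 # crude exploration points
means = centers + 0.5 * eps**2 * grad_log_p(centers) # Langevin-style drifted proposal means
xs = means + eps * rng.standard_normal(len(centers)) # one draw per proposal component

log_q = -0.5 * (xs - means) ** 2 / eps**2 - np.log(eps)   # per-sample proposal log-density
log_w = log_p(xs) - log_q
w = np.exp(log_w - log_w.max()); w /= w.sum()
print((w * xs).sum())                                # self-normalised estimate of E[x] ~ 2.0
```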

A Bayesian Model of node interaction in networks

no code implementations • 18 Feb 2014 • Ingmar Schuster

We are concerned with modeling the strength of links in networks by taking into account how often those links are used.

Tasks: Machine Learning
