1 code implementation • 28 Jan 2021 • Kashif Rasul, Calvin Seward, Ingmar Schuster, Roland Vollgraf
In this work, we propose TimeGrad, an autoregressive model for multivariate probabilistic time series forecasting which samples from the data distribution at each time step by estimating its gradient.
Multivariate Time Series Forecasting • Probabilistic Time Series Forecasting • +1
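Sampling "by estimating the gradient" refers to the score of the log-density. The following is a minimal, generic sketch of annealed Langevin sampling, not TimeGrad itself (which uses a learned, RNN-conditioned denoising network); the `score` function here is a hypothetical stand-in with a known closed form so the example runs end to end.

```python
import numpy as np

rng = np.random.default_rng(0)

def score(x, sigma):
    # Hypothetical stand-in for a learned score network. For a standard-normal
    # target smoothed with noise level sigma, the true score is -x / (1 + sigma^2).
    return -x / (1.0 + sigma**2)

def langevin_sample(dim, sigmas, steps_per_sigma=20, eps=0.01, rng=rng):
    """Annealed Langevin dynamics: refine a noise sample using score estimates,
    moving through a decreasing sequence of noise levels."""
    x = rng.standard_normal(dim)
    for sigma in sigmas:
        step = eps * sigma**2
        for _ in range(steps_per_sigma):
            x = (x + 0.5 * step * score(x, sigma)
                 + np.sqrt(step) * rng.standard_normal(dim))
    return x

# One draw for a single (hypothetical) time step's emission distribution.
x_t = langevin_sample(dim=4, sigmas=[1.0, 0.5, 0.1])
```

In an autoregressive forecaster, a draw like `x_t` would be fed back as input when sampling the next time step.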
1 code implementation • 25 Nov 2020 • Patrick Gelß, Stefan Klus, Ingmar Schuster, Christof Schütte
We propose a method for the approximation of high- or even infinite-dimensional feature vectors, which play an important role in supervised learning.
1 code implementation • ICLR 2021 • Kashif Rasul, Abdul-Saboor Sheikh, Ingmar Schuster, Urs Bergmann, Roland Vollgraf
In this work we model the multivariate temporal dynamics of time series via an autoregressive deep learning model, where the data distribution is represented by a conditioned normalizing flow.
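The core mechanic of a conditioned normalizing flow is the change-of-variables formula: the density of an observation is the base density of its inverse image plus the log-determinant of the Jacobian, with the flow's parameters produced from a conditioning context. A minimal sketch with a single conditional affine layer; the `conditioner` here is a hypothetical placeholder for the learned network:

```python
import numpy as np

def conditioner(h):
    # Hypothetical conditioner network: maps context h (e.g. an RNN state)
    # to a shift and log-scale for the affine transform.
    mu = np.tanh(h)
    log_s = 0.1 * h
    return mu, log_s

def flow_logpdf(x, h):
    """Log-density of x under a conditional affine flow with a
    standard-normal base distribution (change-of-variables formula)."""
    mu, log_s = conditioner(h)
    z = (x - mu) * np.exp(-log_s)                    # invert the affine map
    log_base = -0.5 * (z**2 + np.log(2 * np.pi)).sum()
    log_det = -log_s.sum()                           # log |det dz/dx|
    return log_base + log_det
```

With zero context the transform is the identity, so the density reduces to a standard normal; stacking several such layers (with permutations or couplings between them) yields the expressive flows used in practice.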
no code implementations • 2 Dec 2019 • Ilja Klebanov, Ingmar Schuster, T. J. Sullivan
Conditional mean embeddings (CMEs) have proven themselves to be a powerful tool in many machine learning applications.
no code implementations • 6 Sep 2019 • Kashif Rasul, Ingmar Schuster, Roland Vollgraf, Urs Bergmann
We present a generative model that is defined on finite sets of exchangeable, potentially high-dimensional data.
no code implementations • 27 May 2019 • Ingmar Schuster, Mattes Mollenhauer, Stefan Klus, Krikamol Muandet
The proposed model is based on a novel approach to the reconstruction of probability densities from their kernel mean embeddings by drawing connections to estimation of Radon-Nikodym derivatives in the reproducing kernel Hilbert space (RKHS).
no code implementations • 28 Sep 2018 • Stefan Klus, Andreas Bittracher, Ingmar Schuster, Christof Schütte
We present a novel machine learning approach to understanding conformation dynamics of biomolecules.
no code implementations • 24 Jul 2018 • Mattes Mollenhauer, Ingmar Schuster, Stefan Klus, Christof Schütte
Reproducing kernel Hilbert spaces (RKHSs) play an important role in many statistics and machine learning applications ranging from support vector machines to Gaussian processes and kernel embeddings of distributions.
no code implementations • 18 May 2018 • Ingmar Schuster, Ilja Klebanov
As a by-product it enables estimating the normalizing constant, an important quantity in Bayesian machine learning and statistics.
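Estimating the normalizing constant means recovering Z in p(x) = p̃(x)/Z from an unnormalized density p̃. A standard importance-sampling sketch (not the paper's method, which works differently) illustrates the quantity being estimated, here for a 1-D Gaussian target whose true constant is √(2π):

```python
import numpy as np

rng = np.random.default_rng(2)

def unnorm_logpdf(x):
    # Unnormalized target: exp(-x^2 / 2); true normalizing constant is sqrt(2*pi).
    return -0.5 * x**2

# Importance sampling: Z = E_q[p_tilde(x) / q(x)] for any proposal q
# covering the target. Here q = N(0, 2^2), wider than the target.
xs = 2.0 * rng.standard_normal(100_000)
log_q = -0.5 * (xs / 2.0)**2 - np.log(2.0 * np.sqrt(2 * np.pi))
Z_hat = np.mean(np.exp(unnorm_logpdf(xs) - log_q))
```

The choice of a proposal with heavier tails than the target keeps the importance weights bounded, which controls the estimator's variance.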
no code implementations • 16 May 2018 • Stefan Klus, Sebastian Peitz, Ingmar Schuster
Kernel transfer operators, which can be regarded as approximations of transfer operators such as the Perron-Frobenius or Koopman operator in reproducing kernel Hilbert spaces, are defined in terms of covariance and cross-covariance operators. They have been shown to be closely related to the conditional mean embedding framework developed by the machine learning community.
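Empirically, the covariance and cross-covariance operators reduce to Gram matrices on sampled state pairs (x_i, y_i), and a regularized kernel Koopman approximation can be diagonalized directly. A minimal sketch under that standard construction (function and parameter names are illustrative, not from the paper's code):

```python
import numpy as np

def gaussian_gram(X, Y, bandwidth=1.0):
    """Gaussian-kernel Gram matrix between sample sets X (n, d) and Y (m, d)."""
    d2 = ((X[:, None, :] - Y[None, :, :])**2).sum(-1)
    return np.exp(-d2 / (2 * bandwidth**2))

def kernel_koopman_eigs(X, Y, bandwidth=1.0, reg=1e-6):
    """Eigenvalues of the empirical kernel Koopman approximation
    (G_xx + reg * n * I)^{-1} G_xy, where Y holds the successor states of X."""
    n = len(X)
    G_xx = gaussian_gram(X, X, bandwidth)
    G_xy = gaussian_gram(X, Y, bandwidth)
    A = np.linalg.solve(G_xx + reg * n * np.eye(n), G_xy)
    return np.linalg.eigvals(A)
```

For identity dynamics (Y equal to X) the leading eigenvalue is close to 1, matching the fact that transfer operators always have the constant function as an eigenfunction with eigenvalue 1.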
1 code implementation • 5 Dec 2017 • Stefan Klus, Ingmar Schuster, Krikamol Muandet
Transfer operators such as the Perron-Frobenius or Koopman operator play an important role in the global analysis of complex dynamical systems.
1 code implementation • 11 Oct 2015 • Ingmar Schuster, Heiko Strathmann, Brooks Paige, Dino Sejdinovic
As KSMC does not require access to target gradients, it is particularly applicable on targets whose gradients are unknown or prohibitively expensive.
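The point that gradient-free samplers only need density evaluations can be made concrete with the simplest such method, random-walk Metropolis; this is an illustration of the gradient-free setting, not of KSMC's kernel-informed proposals:

```python
import numpy as np

rng = np.random.default_rng(1)

def log_target(x):
    # Example target: standard-normal log-density (up to a constant).
    # Only pointwise evaluations are needed; no gradient is ever computed.
    return -0.5 * np.dot(x, x)

def rw_metropolis(log_target, x0, n_steps=2000, scale=0.8, rng=rng):
    """Random-walk Metropolis: accept/reject isotropic Gaussian proposals
    using only log-density evaluations."""
    x = np.asarray(x0, float)
    lp = log_target(x)
    samples = []
    for _ in range(n_steps):
        prop = x + scale * rng.standard_normal(x.shape)
        lp_prop = log_target(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis acceptance test
            x, lp = prop, lp_prop
        samples.append(x)
    return np.array(samples)

chain = rw_metropolis(log_target, np.zeros(2))
```

Such evaluation-only samplers apply whenever the target can be scored pointwise, which is exactly the regime where gradient-based methods are unavailable or too expensive.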
no code implementations • 21 Jul 2015 • Ingmar Schuster
Adaptive Monte Carlo schemes developed over the last years usually seek to ensure ergodicity of the sampling process in line with MCMC tradition.
no code implementations • 18 Feb 2014 • Ingmar Schuster
We are concerned with modeling the strength of links in networks by taking into account how often those links are used.