
Reconstructing a dynamical system and forecasting time series by self-consistent deep learning

We introduce a self-consistent deep-learning framework which, for a noisy deterministic time series, provides unsupervised filtering, state-space reconstruction, identification of the underlying differential equations, and forecasting. Without a priori information on the signal, we embed the time series in a state space, where deterministic structures, i.e. attractors, are revealed. Under the assumption that the evolution of solution trajectories is described by an unknown dynamical system, we filter out stochastic outliers. The embedding function, the solution trajectories, and the dynamical system are each represented by a deep neural network. By exploiting the differentiability of the neural solution trajectory, the neural dynamical system is defined locally at each time, mitigating the need to propagate gradients through numerical solvers. On a chaotic time series masked by additive Gaussian noise, we demonstrate the filtering ability and the predictive power of the proposed framework.
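To make the solver-free training signal concrete, the sketch below is a minimal illustration (not the authors' implementation) of the idea that a differentiable neural trajectory x(t) and a neural vector field f(x) can be fit jointly by penalizing the mismatch between dx/dt, obtained by automatic differentiation, and f(x(t)) at sampled times. The network names (TrajectoryNet, VectorField), architectures, data, and loss weighting are all hypothetical placeholders.

```python
# Minimal sketch (assumptions, not the paper's code): jointly train a neural
# trajectory x(t) and a neural vector field f(x) so that dx/dt(t) ≈ f(x(t))
# at sampled times, i.e. the dynamics are enforced locally without running
# gradients through a numerical ODE solver.
import torch
import torch.nn as nn

class TrajectoryNet(nn.Module):          # x(t): R -> R^d  (hypothetical name)
    def __init__(self, dim=3, width=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(1, width), nn.Tanh(),
                                 nn.Linear(width, width), nn.Tanh(),
                                 nn.Linear(width, dim))
    def forward(self, t):                # t: (N, 1) column of times
        return self.net(t)

class VectorField(nn.Module):            # f(x): R^d -> R^d  (hypothetical name)
    def __init__(self, dim=3, width=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, width), nn.Tanh(),
                                 nn.Linear(width, width), nn.Tanh(),
                                 nn.Linear(width, dim))
    def forward(self, x):
        return self.net(x)

traj, field = TrajectoryNet(), VectorField()
opt = torch.optim.Adam(list(traj.parameters()) + list(field.parameters()), lr=1e-3)

t_obs = torch.linspace(0.0, 10.0, 200).unsqueeze(1)   # observation times
y_obs = torch.randn(200, 3)                           # placeholder embedded noisy data

for step in range(1000):
    t = t_obs.clone().requires_grad_(True)
    x = traj(t)                                       # neural trajectory at times t
    # dx/dt via autodiff of the trajectory network, one component at a time
    dxdt = torch.stack([
        torch.autograd.grad(x[:, i].sum(), t, create_graph=True)[0].squeeze(1)
        for i in range(x.shape[1])
    ], dim=1)
    loss_fit = ((x - y_obs) ** 2).mean()              # trajectory stays close to the data
    loss_ode = ((dxdt - field(x)) ** 2).mean()        # trajectory is consistent with f
    loss = loss_fit + loss_ode
    opt.zero_grad(); loss.backward(); opt.step()
```

Because the derivative of the trajectory is available in closed form from the network, the consistency term is evaluated pointwise at each sampled time, which is what removes the need to backpropagate through an ODE integrator.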
