Search Results for author: Patrick Kidger

Found 16 papers, 12 papers with code

On Neural Differential Equations

3 code implementations • 4 Feb 2022 • Patrick Kidger

Topics include: neural ordinary differential equations (e.g. for hybrid neural/mechanistic modelling of physical systems); neural controlled differential equations (e.g. for learning functions of irregular time series); and neural stochastic differential equations (e.g. to produce generative models capable of representing complex stochastic dynamics, or sampling from complex high-dimensional distributions).

Irregular Time Series • Symbolic Regression +2
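As a conceptual illustration of the first topic above: a neural ODE treats a parameterised function f_θ as the right-hand side of dy/dt = f_θ(y) and produces its output by numerical integration. The sketch below is a toy illustration in plain Python (hypothetical single-tanh "network", fixed-step Euler rather than the adaptive solvers used in practice), not code from the thesis.

```python
# Toy sketch of a neural ODE (hypothetical weights, fixed-step Euler;
# real implementations use autodiff frameworks and adaptive solvers).
import math

def vector_field(theta, y):
    # Hypothetical tiny "network": a single tanh layer with scalar weights.
    w, b = theta
    return [math.tanh(w * yi + b) for yi in y]

def odeint_euler(theta, y0, t0, t1, steps=100):
    """Fixed-step Euler integration of dy/dt = f_theta(y) from t0 to t1."""
    y, dt = list(y0), (t1 - t0) / steps
    for _ in range(steps):
        dy = vector_field(theta, y)
        y = [yi + dt * dyi for yi, dyi in zip(y, dy)]
    return y

# "Forward pass": integrate the learned dynamics over [0, 1].
y1 = odeint_euler(theta=(0.5, 0.1), y0=[1.0, -1.0], t0=0.0, t1=1.0)
```

Training would then backpropagate through (or around, via the adjoint method) this integration loop.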

Equinox: neural networks in JAX via callable PyTrees and filtered transformations

1 code implementation • 30 Oct 2021 • Patrick Kidger, Cristian Garcia

One: parameterised functions are themselves represented as 'PyTrees', which means that the parameterisation of a function is transparent to the JAX framework.
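The PyTree idea can be illustrated without JAX: a model is just a nested container of parameter leaves, so one generic tree-mapping function transforms every parameter with no model-specific code. The sketch below is plain Python (not the Equinox or JAX API, and the model structure is hypothetical).

```python
# Plain-Python illustration of the "PyTree" idea (not the Equinox/JAX API):
# a parameterised function is represented as data, i.e. a nested container
# of parameter leaves, so generic tree operations apply to it transparently.

def tree_map(fn, tree):
    """Apply `fn` to every leaf of a nested dict/list structure."""
    if isinstance(tree, dict):
        return {k: tree_map(fn, v) for k, v in tree.items()}
    if isinstance(tree, list):
        return [tree_map(fn, v) for v in tree]
    return fn(tree)  # leaf: a parameter

# A hypothetical "model" whose weights live in a nested structure.
model = {"linear": {"weight": [[1.0, 2.0]], "bias": [0.5]}}

# Generic transformation (e.g. a gradient-descent-style rescale): one call,
# no knowledge of the model's internal layout required.
scaled = tree_map(lambda p: 0.1 * p, model)
```

In JAX itself, `jax.tree_util.tree_map` plays this role, and registering models as PyTrees is what lets transformations like `jit` and `grad` see their parameters.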

Neural Controlled Differential Equations for Online Prediction Tasks

2 code implementations • 21 Jun 2021 • James Morrill, Patrick Kidger, Lingyi Yang, Terry Lyons

This is fine when the whole time series is observed in advance, but means that Neural CDEs are not suitable for use in online prediction tasks, where predictions need to be made in real-time: a major use case for recurrent networks.

Irregular Time Series • Time Series +1

Efficient and Accurate Gradients for Neural SDEs

2 code implementations • NeurIPS 2021 • Patrick Kidger, James Foster, Xuechen Li, Terry Lyons

This reduces computational cost (giving up to a 1.87× speedup) and removes the numerical truncation errors associated with gradient penalty.

Neural SDEs as Infinite-Dimensional GANs

1 code implementation • 6 Feb 2021 • Patrick Kidger, James Foster, Xuechen Li, Harald Oberhauser, Terry Lyons

Stochastic differential equations (SDEs) are a staple of mathematical modelling of temporal dynamics.

Time Series • Time Series Analysis
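For context, SDEs of the form dX_t = μ(X_t) dt + σ(X_t) dW_t are typically sampled with a discretisation scheme. The sketch below uses the standard Euler–Maruyama method on a hypothetical Ornstein–Uhlenbeck example; this is a textbook scheme for illustration, not this paper's contribution.

```python
# Standard Euler-Maruyama discretisation (textbook method, shown only to
# illustrate what "simulating an SDE" means): simulate
#   dX_t = mu(X_t) dt + sigma(X_t) dW_t
# on a fixed time grid, with Brownian increments drawn as N(0, dt).
import random

def euler_maruyama(mu, sigma, x0, t1, steps, seed=0):
    rng = random.Random(seed)
    x, dt = x0, t1 / steps
    path = [x]
    for _ in range(steps):
        dW = rng.gauss(0.0, dt ** 0.5)        # Brownian increment ~ N(0, dt)
        x = x + mu(x) * dt + sigma(x) * dW    # one Euler-Maruyama step
        path.append(x)
    return path

# Hypothetical Ornstein-Uhlenbeck process: mean-reverting towards zero.
path = euler_maruyama(mu=lambda x: -2.0 * x, sigma=lambda x: 0.1,
                      x0=1.0, t1=1.0, steps=1000)
```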

"Hey, that's not an ODE": Faster ODE Adjoints with 12 Lines of Code

no code implementations • 1 Jan 2021 • Patrick Kidger, Ricky T. Q. Chen, Terry Lyons

Neural differential equations may be trained by backpropagating gradients via the adjoint method, which is another differential equation typically solved using an adaptive-step-size numerical differential equation solver.

Time Series • Time Series Analysis

Neural SDEs Made Easy: SDEs are Infinite-Dimensional GANs

no code implementations • 1 Jan 2021 • Patrick Kidger, James Foster, Xuechen Li, Harald Oberhauser, Terry Lyons

Several authors have introduced Neural Stochastic Differential Equations (Neural SDEs), often involving complex theory with various limitations.

Neural CDEs for Long Time Series via the Log-ODE Method

no code implementations • 28 Sep 2020 • James Morrill, Patrick Kidger, Cristopher Salvi, James Foster, Terry Lyons

Neural Controlled Differential Equations (Neural CDEs) are the continuous-time analogue of an RNN, just as Neural ODEs are analogous to ResNets.

Time Series • Time Series Analysis

"Hey, that's not an ODE": Faster ODE Adjoints via Seminorms

3 code implementations • 20 Sep 2020 • Patrick Kidger, Ricky T. Q. Chen, Terry Lyons

Neural differential equations may be trained by backpropagating gradients via the adjoint method, which is another differential equation typically solved using an adaptive-step-size numerical differential equation solver.

Time Series • Time Series Analysis
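The "seminorms" in the title refer to the norm used in an adaptive solver's accept/reject test for each step. Roughly, the idea is that replacing the full norm of the local error estimate with a seminorm that ignores components whose accuracy does not matter lets the solver take larger steps. The sketch below is a toy illustration with hypothetical numbers, not the paper's implementation.

```python
# Toy sketch of the seminorm idea (hypothetical numbers, not the paper's
# code): an adaptive step-size solver accepts a step when the norm of its
# local error estimate is within tolerance. A seminorm that zeroes out
# components whose accuracy is unimportant rejects fewer steps.

def rms_norm(err):
    """Standard root-mean-square norm over all components."""
    return (sum(e * e for e in err) / len(err)) ** 0.5

def seminorm(ignore):
    """Build a seminorm that discards the error in the `ignore` indices."""
    def norm(err):
        kept = [e for i, e in enumerate(err) if i not in ignore]
        return (sum(e * e for e in kept) / len(err)) ** 0.5
    return norm

def accept_step(err_estimate, tol, norm):
    return norm(err_estimate) <= tol

err = [1e-4, 1e-4, 5e-2]  # large error only in component 2
rejected_full = not accept_step(err, 1e-3, rms_norm)           # full norm: reject
accepted_semi = accept_step(err, 1e-3, seminorm(ignore={2}))   # seminorm: accept
```

Fewer rejected steps means fewer vector-field evaluations during the adjoint solve, which is where the speedup comes from.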

Neural Rough Differential Equations for Long Time Series

3 code implementations • 17 Sep 2020 • James Morrill, Cristopher Salvi, Patrick Kidger, James Foster, Terry Lyons

Neural controlled differential equations (CDEs) are the continuous-time analogue of recurrent neural networks, as Neural ODEs are to residual networks, and offer a memory-efficient continuous-time way to model functions of potentially irregular time series.

Irregular Time Series • Time Series +2

Generalised Interpretable Shapelets for Irregular Time Series

2 code implementations • 28 May 2020 • Patrick Kidger, James Morrill, Terry Lyons

The shapelet transform is a form of feature extraction for time series, in which a time series is described by its similarity to each of a collection of `shapelets'.

Audio Classification • Irregular Time Series +2
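The classical shapelet transform the entry builds on can be sketched in a few lines: each feature is the minimum distance between a shapelet and any window of the series. The sketch below uses hypothetical data and shows only the classical regularly-sampled version, not the paper's generalisation to irregular sampling and learned, differentiable shapelets.

```python
# Classical shapelet transform sketch (hypothetical data; the paper's
# generalised, differentiable version is not shown here).

def shapelet_distance(series, shapelet):
    """Minimum squared L2 distance between `shapelet` and any window of `series`."""
    m = len(shapelet)
    best = float("inf")
    for start in range(len(series) - m + 1):
        window = series[start:start + m]
        best = min(best, sum((w - s) ** 2 for w, s in zip(window, shapelet)))
    return best

series = [0.0, 0.0, 1.0, 2.0, 1.0, 0.0]
bump = [1.0, 2.0, 1.0]   # shapelet: a "bump" shape, present in the series
flat = [0.0, 0.0, 0.0]   # shapelet: a flat segment

# The series is described by its similarity to each shapelet:
features = [shapelet_distance(series, s) for s in (bump, flat)]
```

The resulting feature vector can then be fed to any downstream classifier; interpretability comes from inspecting which shapelets a series matches.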

Neural Controlled Differential Equations for Irregular Time Series

5 code implementations • NeurIPS 2020 • Patrick Kidger, James Morrill, James Foster, Terry Lyons

The resulting neural controlled differential equation model is directly applicable to the general setting of partially-observed irregularly-sampled multivariate time series, and (unlike previous work on this problem) it may utilise memory-efficient adjoint-based backpropagation even across observations.

Irregular Time Series • Time Series +1
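A neural CDE evolves a hidden state z by dz = f_θ(z) dX, where X is (an interpolation of) the observed data, so the data drives the dynamics through its increments. The sketch below is a minimal fixed-step discretisation with a hypothetical hand-written vector field; real implementations learn f_θ and use interpolation plus adaptive solvers.

```python
# Conceptual sketch of a controlled differential equation dz = f(z) dX
# (hypothetical vector field; real neural CDEs learn f and interpolate X).
import math

def f(z):
    """Hypothetical vector field: maps a 2-dim state to a 2x2 matrix."""
    return [[math.tanh(z[0]), 0.1],
            [0.1, math.tanh(z[1])]]

def cde_euler(z0, xs):
    """Euler discretisation: z_{k+1} = z_k + f(z_k) @ (x_{k+1} - x_k).

    `xs` are the (possibly irregularly spaced) observations of the driving
    path; note the update depends on the data only through its increments.
    """
    z = list(z0)
    for x_prev, x_next in zip(xs, xs[1:]):
        dx = [a - b for a, b in zip(x_next, x_prev)]
        Fz = f(z)
        z = [z[i] + sum(Fz[i][j] * dx[j] for j in range(len(dx)))
             for i in range(len(z))]
    return z

# Hypothetical 2-channel observations (time is often included as a channel).
z_final = cde_euler(z0=[0.0, 0.0], xs=[[0.0, 0.0], [0.5, 1.0], [1.0, 0.5]])
```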

Signatory: differentiable computations of the signature and logsignature transforms, on both CPU and GPU

1 code implementation • ICLR 2021 • Patrick Kidger, Terry Lyons

Signatory is a library for calculating and performing functionality related to the signature and logsignature transforms.
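What the signature transform computes can be sketched for a depth-2 truncation of a piecewise-linear path: level one is the total increment, and level two collects the iterated integrals, updated segment-by-segment via Chen's identity. The plain-Python sketch below illustrates the underlying mathematics only; it is not Signatory's API, which provides fast, differentiable CPU/GPU implementations to arbitrary depth.

```python
# Depth-2 signature of a piecewise-linear path (mathematical illustration,
# not Signatory's API). Level 1 holds the total increments; level 2 holds
# the iterated integrals, built up one straight segment at a time.

def signature_depth2(path):
    d = len(path[0])
    s1 = [0.0] * d                       # level 1: total increment
    s2 = [[0.0] * d for _ in range(d)]   # level 2: iterated integrals
    for p, q in zip(path, path[1:]):
        dx = [qi - pi for qi, pi in zip(q, p)]
        # Chen's identity for appending a straight-line segment with
        # increment dx: S2 <- S2 + S1 (x) dx + dx (x) dx / 2.
        for i in range(d):
            for j in range(d):
                s2[i][j] += s1[i] * dx[j] + dx[i] * dx[j] / 2.0
        s1 = [a + b for a, b in zip(s1, dx)]
    return s1, s2

# An L-shaped path in 2 dimensions: right one unit, then up one unit.
s1, s2 = signature_depth2([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0]])
```

The antisymmetric part of the level-2 terms is the path's Lévy area, which is why the signature distinguishes orderings of events that the raw increments alone cannot.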

Deep Signature Transforms

3 code implementations • NeurIPS 2019 • Patric Bonnier, Patrick Kidger, Imanol Perez Arribas, Cristopher Salvi, Terry Lyons

The signature is an infinite graded sequence of statistics known to characterise a stream of data up to a negligible equivalence class.

Universal Approximation with Deep Narrow Networks

no code implementations • 21 May 2019 • Patrick Kidger, Terry Lyons

The classical Universal Approximation Theorem holds for neural networks of arbitrary width and bounded depth.
