Search Results for author: Zhixin Lu

Found 4 papers, 0 papers with code

Learning Continuous Chaotic Attractors with a Reservoir Computer

no code implementations • 16 Oct 2021 • Lindsay M. Smith, Jason Z. Kim, Zhixin Lu, Dani S. Bassett

Neural systems are well known for their ability to learn and store information as memories.

Teaching Recurrent Neural Networks to Modify Chaotic Memories by Example

no code implementations • 3 May 2020 • Jason Z. Kim, Zhixin Lu, Erfan Nozari, George J. Pappas, Danielle S. Bassett

Here we demonstrate that a recurrent neural network (RNN) can learn to modify its representation of complex information using only examples, and we explain the associated learning mechanism with new theory.

Time Series • Time Series Analysis

Using Machine Learning to Replicate Chaotic Attractors and Calculate Lyapunov Exponents from Data

no code implementations • 19 Oct 2017 • Jaideep Pathak, Zhixin Lu, Brian R. Hunt, Michelle Girvan, Edward Ott

For the case of the KS equation, we note that as the system's spatial size increases, so does the number of Lyapunov exponents, yielding a challenging test of our method, which we find it passes successfully.

Chaotic Dynamics

Reservoir observers: Model-free inference of unmeasured variables in chaotic systems

no code implementations • Chaos 27, 041102 (2017) • Zhixin Lu, Jaideep Pathak, Brian Hunt, Michelle Girvan, Roger Brockett, Edward Ott

A scheme that accomplishes this is called an “observer.” We consider the case in which a model of the system is unavailable or insufficiently accurate, but “training” time series data of the desired state variables are available for a short period of time, and a limited number of other system variables are continually measured.

Time Series Analysis
