Final submission docs:

Link to poster
Link to video presentation
Link to final write-up
Link to code

Link to reflections

Title:

A comparative study between different neural networks for dynamical systems

Who

Alan John Varghese (ajohnvar), Nikhil Kadivar (nkadivar)

Introduction

Nonlinear dynamical systems appear widely, from climate to biological systems. In many applications we only have access to observed time series and lack the exact governing equations, yet in practice we often want to forecast the dynamics of these systems. We aim to leverage deep learning models to solve this problem.

One interesting work in this line is the Hamiltonian Neural Network (HNN), which applies to systems where energy is conserved. Another is the Multistep Neural Network, which is more general than HNNs. We consider three benchmark problems: i) a spring-mass system, ii) the three-body problem in mechanics, and iii) the Lorenz system (a chaotic nonlinear dynamical system relevant to weather modeling). We compare the performance of LSTM, HNN, and Multistep Neural Networks on these problems.
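As a concrete example, the Lorenz benchmark is governed by three coupled ODEs. A minimal sketch of its right-hand side with the standard chaotic parameters (sigma = 10, rho = 28, beta = 8/3):

```python
import numpy as np

def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side f(x) of the Lorenz system dx/dt = f(x)."""
    x, y, z = state
    return np.array([sigma * (y - x),
                     x * (rho - y) - z,
                     x * y - beta * z])
```

With these parameters the system is chaotic, which is what makes long-horizon forecasting from data alone a hard test case.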

What it does

The training data comprises trajectories of the dynamical system generated from various random initial conditions. During testing, we feed an initial condition into the trained neural network model (coupled with a numerical solver), and it predicts the evolution of the trajectory.
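The train/test pipeline can be sketched as follows: an ODE solver rolls out trajectories from random initial conditions to build the training set, and at test time the same rollout loop is reused with the learned dynamics in place of the true ones. A minimal sketch with a classical RK4 step and the spring-mass system (function names are illustrative):

```python
import numpy as np

def rk4_step(f, x, h):
    # One classical fourth-order Runge-Kutta step for dx/dt = f(x).
    k1 = f(x)
    k2 = f(x + 0.5 * h * k1)
    k3 = f(x + 0.5 * h * k2)
    k4 = f(x + h * k3)
    return x + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

def make_trajectory(f, x0, h, n_steps):
    """Roll out a trajectory from initial condition x0. At test time the
    same loop is used, but with the learned f instead of the true one."""
    traj = [x0]
    for _ in range(n_steps):
        traj.append(rk4_step(f, traj[-1], h))
    return np.stack(traj)

# Spring-mass system: state x = (q, p), dq/dt = p, dp/dt = -q.
spring = lambda x: np.array([x[1], -x[0]])
traj = make_trajectory(spring, np.array([1.0, 0.0]), h=0.1, n_steps=100)
print(traj.shape)  # (101, 2)
```

For the spring-mass system the energy (q^2 + p^2)/2 stays nearly constant along the rollout, which is the property the HNN exploits by construction.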

Related work

The idea of Hamiltonian Neural Networks was introduced by Sam Greydanus et al. Their paper introduces neural networks that learn the Hamiltonian of a dynamical system directly from data.
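The core idea: a scalar-valued network H(q, p) is trained so that the field given by Hamilton's equations, dq/dt = dH/dp and dp/dt = -dH/dq, matches the observed time derivatives. A minimal numpy sketch, with finite differences standing in for the automatic differentiation used in practice and the true spring-mass Hamiltonian standing in for a trained network:

```python
import numpy as np

def hamiltonian_field(H, q, p, eps=1e-5):
    # Hamilton's equations: dq/dt = dH/dp, dp/dt = -dH/dq,
    # with the partials approximated by central finite differences.
    dH_dq = (H(q + eps, p) - H(q - eps, p)) / (2 * eps)
    dH_dp = (H(q, p + eps) - H(q, p - eps)) / (2 * eps)
    return np.array([dH_dp, -dH_dq])

# In an HNN, H would be a neural network; here we use the
# spring-mass Hamiltonian H = (q^2 + p^2)/2 for illustration.
H = lambda q, p: 0.5 * (q**2 + p**2)
print(hamiltonian_field(H, 1.0, 0.0))  # approximately [0., -1.]
```

Because the field is derived from a single scalar H, the learned dynamics conserve that H by construction, which is why the HNN suits energy-conserving systems.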

The multistep neural network model was introduced by Raissi et al. in the paper titled 'Multistep Neural Networks for Data-driven Discovery of Nonlinear Dynamical Systems'. It introduces neural networks that combine the observed data with linear multistep schemes from numerical analysis to learn the dynamics function f(x), where dx/dt = f(x).
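The training signal is the residual of a linear multistep scheme evaluated on consecutive observed states. A minimal sketch using the trapezoidal rule, one member of the multistep family the paper considers (in training, `f` would be the neural network being fit):

```python
import numpy as np

def multistep_loss(f, X, h):
    """Trapezoidal-rule residual over a trajectory X of shape (N, d):
    x_{n+1} - x_n - (h/2) * (f(x_{n+1}) + f(x_n)) vanishes (up to
    discretization error) when f is the true dynamics; the sum of
    squared residuals serves as the training loss."""
    F = np.array([f(x) for x in X])
    res = X[1:] - X[:-1] - 0.5 * h * (F[1:] + F[:-1])
    return np.sum(res**2)
```

Minimizing this loss over the network parameters recovers f without ever needing the derivatives dx/dt as labels, since the scheme ties f directly to the observed states.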

Challenges we ran into

The first challenge was implementing an LSTM for this problem setup. Intuitively, since the trajectories are sequences, an LSTM should at least work, but implementing it after preprocessing the data proved challenging. The second challenge was improving the LSTM's predictions; we found it starts performing decently only after a very large number of training iterations. The results below for the three-body dataset were obtained with two stacked LSTM layers of 200 units each, followed by a dense layer. Even then, we did not reach the accuracy of the HNN and Multistep Neural Networks, which converged much faster.
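For reference, the preprocessing step that tripped us up can be sketched as follows: each trajectory is cut into overlapping input windows, with the next state as the regression target (window length and names here are illustrative, not the exact settings we used):

```python
import numpy as np

def make_windows(traj, window):
    """Turn one trajectory of shape (T, d) into LSTM training pairs:
    inputs of shape (T - window, window, d) and next-state targets
    of shape (T - window, d)."""
    X = np.stack([traj[i:i + window] for i in range(len(traj) - window)])
    y = traj[window:]
    return X, y

traj = np.random.randn(100, 6)     # e.g. one three-body trajectory sample
X, y = make_windows(traj, window=10)
print(X.shape, y.shape)  # (90, 10, 6) (90, 6)
```

At test time the model is applied autoregressively: the predicted state is appended to the window and the oldest state is dropped, which is one reason errors accumulate faster than in the solver-coupled models.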

Accomplishments that we're proud of

We successfully implemented and tested three neural network models (LSTM, Hamiltonian Neural Networks, and Multistep Neural Networks) on three datasets. We are particularly happy with the predictions of the Multistep Neural Network on the Lorenz system data.

What we learned

How to implement, train, and compare LSTMs, Hamiltonian Neural Networks, and Multistep Neural Networks for forecasting dynamical systems.

What's next for Deep learning for dynamical systems

The next steps are hyperparameter-sensitivity studies for each network model on each dataset, along with further ablation studies. Beyond that, we could implement and test Lagrangian Neural Networks (LNNs), which can parameterize arbitrary Lagrangians using neural networks.
