
Linear Antisymmetric Recurrent Neural Networks

Abstract

Recurrent Neural Networks (RNNs) have a form of memory where the output from a node at
one timestep is fed back as input at the next timestep, in addition to the data from the previous
layer. This makes them highly suitable for time series analysis. However, standard RNNs have
known weaknesses, such as struggling with long-term memory. In this paper, we suggest a new
recurrent network structure called the Linear Antisymmetric RNN (LARNN). This structure is
based on the numerical solution of an Ordinary Differential Equation (ODE) whose stability
properties yield a stable solution, which corresponds to long-term memory. Three numerical
methods are suggested for solving the ODE: Forward Euler, Backward Euler, and the midpoint
method. The suggested structure has been implemented in Keras, and several simulated datasets
have been used to evaluate its performance. In the investigated cases, the LARNN performs
better than or comparably to the Long Short-Term Memory (LSTM) network, the current state
of the art for RNNs.
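
To illustrate the idea the abstract describes, the following is a minimal NumPy sketch of the
three discretisations applied to the linear ODE h'(t) = W h(t) + V x(t) with an antisymmetric
recurrent matrix W = M - M^T. Antisymmetry makes the eigenvalues of W purely imaginary, so
the continuous dynamics neither decay nor blow up, which is the stability property the abstract
links to long-term memory. The variable names, step size, and exact update rules here are
illustrative assumptions based on the abstract, not the paper's definitions.

  import numpy as np

  rng = np.random.default_rng(0)
  n_hidden, n_in = 8, 3
  M = rng.standard_normal((n_hidden, n_hidden))
  W = M - M.T                      # antisymmetric recurrent weights (assumed form)
  V = rng.standard_normal((n_hidden, n_in))
  eps = 0.1                        # step size of the numerical ODE solver (assumed)

  def forward_euler_step(h, x):
      # Explicit Forward Euler: h_{t+1} = h_t + eps * (W h_t + V x_t)
      return h + eps * (W @ h + V @ x)

  def backward_euler_step(h, x):
      # Implicit Backward Euler: solve (I - eps W) h_{t+1} = h_t + eps V x_t
      A = np.eye(n_hidden) - eps * W
      return np.linalg.solve(A, h + eps * (V @ x))

  def midpoint_step(h, x):
      # Implicit midpoint rule:
      # (I - eps/2 W) h_{t+1} = (I + eps/2 W) h_t + eps V x_t
      A = np.eye(n_hidden) - 0.5 * eps * W
      B = np.eye(n_hidden) + 0.5 * eps * W
      return np.linalg.solve(A, B @ h + eps * (V @ x))

  # Run a short input sequence through one of the solvers.
  h = np.zeros(n_hidden)
  for x in rng.standard_normal((20, n_in)):
      h = midpoint_step(h, x)
  print(h)

For antisymmetric W, the midpoint rule reduces to multiplication by a Cayley transform of W,
which is orthogonal, so under this sketch's assumptions the hidden-state norm is preserved
exactly between inputs, a natural candidate for long-term memory.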

Category

Academic article

Language

English

Author(s)

Affiliation

  • SINTEF Digital / Mathematics and Cybernetics
  • Norwegian University of Science and Technology

Year

2020

Published in

Proceedings of Machine Learning Research (PMLR)

Volume

120

Page(s)

170–178

View this publication at Norwegian Research Information Repository