11 Jun 2021

Reservoir computers are a fast-training variant of recurrent neural networks that excel at approximating nonlinear dynamical systems and at time-series prediction. These machine learning models act as self-organizing, nonlinear fading-memory filters. While they benefit from low overall complexity, the matrix computations remain a bottleneck. This work applies the controllability matrix from control theory to quickly identify a reduced-size replacement reservoir. Given a large, task-effective reservoir matrix, we calculate the rank of the associated controllability matrix. This simple calculation identifies the rank required for a reduced-size replacement, yielding further speed-ups for an already fast learning model. Additionally, the rank calculation characterizes the state-space reachable set needed to model the input data.
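To make the rank check concrete, here is a minimal sketch (not the authors' code) assuming the standard linear state-space view of a reservoir, x_{k+1} = W x_k + W_in u_k, where W is the reservoir weight matrix and W_in the input weight matrix. The controllability matrix is the Krylov block matrix [W_in, W W_in, ..., W^{n-1} W_in], and its numerical rank suggests the size of a reduced replacement reservoir; the names W, W_in, and controllability_rank are placeholders introduced for illustration.

```python
import numpy as np

def controllability_rank(W, W_in, tol=None):
    """Numerical rank of the controllability matrix [W_in, W W_in, ..., W^(n-1) W_in].

    W     : (n, n) reservoir weight matrix (placeholder name)
    W_in  : (n, m) input weight matrix (placeholder name)
    Returns the rank, interpreted here as the candidate size of a
    reduced-size replacement reservoir.
    """
    n = W.shape[0]
    blocks = [W_in]
    for _ in range(n - 1):
        blocks.append(W @ blocks[-1])  # next Krylov block W^k W_in
    C = np.hstack(blocks)              # controllability matrix, shape (n, n*m)
    return np.linalg.matrix_rank(C, tol=tol)

# Example: a random 100-node reservoir driven by a single input channel
rng = np.random.default_rng(0)
W = rng.normal(scale=1.0 / np.sqrt(100), size=(100, 100))
W_in = rng.normal(size=(100, 1))
print(controllability_rank(W, W_in))   # rank <= 100; a full-rank result means no reduction
```

A rank below n indicates that the reachable subspace of the driven reservoir has lower dimension than the reservoir itself, which is the signal (per the abstract) that a smaller reservoir can replace the large one.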

Chairs:
Dionysios Kalogerias

