Echo state networks (ESNs) are a type of recurrent neural model with a fixed internal weight structure and an adaptable readout trained using the network's hidden states as features. Since an ESN's weight structure is fixed (and often randomly generated), it is important to understand how the behaviours that emerge from these networks are influenced by the design choices made in the network's construction. In this thesis, we examine the impact of several of these choices on the behaviour of ESNs. First, we examine the role of network weight structure in determining the memory capacity of ESNs. By drawing connections with results in control theory, we derive an expression for the memory capacity of a linear network in terms of structures within its weights. Next, we show that the previously reported phenomenon of deeper layers operating at slower 'time-scales' is exhibited even by linear networks, and we provide deeper insight into this behaviour by examining the sensitivity to perturbation and the memory capacity of different layers. Finally, we examine the asymptotic behaviour of linear networks, and of input-driven linear systems more generally. In the cases where the network is stable, we derive properties that follow from this stability, and construct tail bounds on the components of the hidden state when the network's input is perturbed by noise. In the unstable cases, we construct bounds on the expectation of the hidden state's norm in the presence of noisy input.
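The abstract's central objects can be illustrated concretely. Below is a minimal sketch of an ESN with a fixed, randomly generated reservoir and a trained linear readout, evaluated on a k-step delay-recall task. The squared correlation between the readout's prediction and the delayed input is the k-step memory function; summing it over all delays k gives the memory capacity studied in the thesis. All dimensions, the spectral-radius scaling of 0.9, and the ridge parameter are illustrative assumptions, not values taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not from the thesis).
n_reservoir, n_input, T = 100, 1, 500

# Fixed, randomly generated weights: these are never trained.
W = rng.standard_normal((n_reservoir, n_reservoir))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # scale spectral radius to 0.9
W_in = rng.standard_normal((n_reservoir, n_input))

# Drive the reservoir with an i.i.d. scalar input signal.
u = rng.standard_normal((T, n_input))
x = np.zeros(n_reservoir)
states = np.zeros((T, n_reservoir))
for t in range(T):
    x = np.tanh(W @ x + W_in @ u[t])
    states[t] = x

# Train only the readout, via ridge regression, to reproduce the
# input delayed by k steps -- a standard memory-capacity task.
k = 5
X, y = states[k:], u[:-k, 0]
ridge = 1e-6
w_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_reservoir), X.T @ y)

# Squared correlation of prediction and target: the k-step memory
# function. Summing this quantity over k yields the memory capacity.
pred = X @ w_out
mc_k = np.corrcoef(pred, y)[0, 1] ** 2
```

For a purely linear reservoir (replace `np.tanh` with the identity), the memory capacity admits the closed-form treatment in terms of the weight structure that the thesis develops.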
| Date of Award | 1 Aug 2020 |
|---|---|
| Original language | English |
| Awarding Institution | The University of Manchester |
- Echo State Networks
- Recurrent Neural Networks
Effects of Network Weight Structure in Echo State Networks
Wood, D. (Author). 1 Aug 2020
Student thesis: PhD