Recursive backpropagation algorithm applied to a globally recurrent neural network

Steven Michael Dionisi, University of Nevada, Las Vegas

Abstract

In general, recursive neural networks can yield a smaller structure than a purely feedforward neural network, in much the same way that infinite impulse response (IIR) filters can replace longer finite impulse response (FIR) filters. This thesis presents a new adaptive algorithm for training recursive neural networks, based on the least mean square (LMS) algorithms developed for other adaptive architectures. The new algorithm overcomes several limitations of existing recursive neural network training algorithms, such as epoch training and the need for large amounts of memory storage. To demonstrate the algorithm, adaptive architectures built from a recursive neural network and trained with the new algorithm are applied to four adaptive systems, and the results are compared with those of adaptive systems built from other adaptive filters. In these examples, the new algorithm performs both linear and nonlinear transformations and, in some cases, significantly outperforms the other adaptive filters. This thesis also discusses possible avenues for future exploration of adaptive systems constructed from recursive neural networks.
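For background on the LMS-style adaptation the abstract refers to, the following is a minimal sketch of the classic LMS update for a linear FIR adaptive filter, not the recursive neural network algorithm developed in this thesis; the filter order, step size, and test signals are illustrative assumptions.

import numpy as np

def lms_filter(x, d, num_taps=8, mu=0.01):
    """Adapt an FIR filter so its output tracks the desired signal d (illustrative sketch)."""
    w = np.zeros(num_taps)           # adaptive tap weights
    y = np.zeros(len(x))             # filter output
    e = np.zeros(len(x))             # error signal d - y
    for n in range(num_taps, len(x)):
        u = x[n - num_taps:n][::-1]  # most recent input samples, newest first
        y[n] = w @ u                 # current filter output
        e[n] = d[n] - y[n]           # instantaneous error
        w += mu * e[n] * u           # LMS weight update
    return y, e, w

# Hypothetical usage: identify an unknown 3-tap system from its input/output data.
rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
d = np.convolve(x, [0.5, -0.3, 0.1], mode="full")[:len(x)]
y, e, w = lms_filter(x, d)

Because the LMS update adapts the weights sample by sample from the instantaneous error, it requires neither epoch training nor storage of past data, which is the property the thesis algorithm carries over to recursive neural networks.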