Master of Science (MS)
Electrical and Computer Engineering
In general, recursive neural networks can yield a smaller structure than purely feedforward neural networks, in much the same way that infinite impulse response (IIR) filters can replace longer finite impulse response (FIR) filters. This thesis presents a new adaptive algorithm for training recursive neural networks. The algorithm is based on the least mean square (LMS) algorithms designed for other adaptive architectures, and it overcomes several limitations of current recursive neural network algorithms, such as epoch training and the need for large amounts of memory storage. To demonstrate the new algorithm, adaptive architectures constructed with a recursive neural network and trained with the new algorithm are applied to four adaptive systems, and the results are compared with adaptive systems constructed with other adaptive filters. In these examples, the new algorithm demonstrates the ability to perform linear and nonlinear transformations and, in some cases, significantly outperforms the other adaptive filters. The thesis also discusses possible avenues for future exploration of adaptive systems constructed from recursive neural networks.
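The thesis's algorithm itself is not reproduced here, but the LMS family it builds on can be illustrated with a minimal sketch: a transversal (FIR) adaptive filter whose tap weights are updated by the instantaneous gradient of the squared error. The function name, tap count, and step size below are illustrative choices, not values from the thesis.

```python
import numpy as np

def lms_filter(x, d, n_taps=4, mu=0.05):
    """Adapt an FIR filter so its output y tracks the desired
    signal d, using the least mean square (LMS) weight update
    w <- w + mu * e[n] * u, where u is the current input vector."""
    w = np.zeros(n_taps)          # tap weights, adapted online
    y = np.zeros(len(x))          # filter output
    e = np.zeros(len(x))          # instantaneous error d - y
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]  # newest sample first
        y[n] = w @ u                       # filter output
        e[n] = d[n] - y[n]                 # error signal
        w = w + mu * e[n] * u              # LMS gradient step
    return y, e, w
```

As a usage example, feeding the filter white noise and a desired signal produced by an unknown FIR system drives the error toward zero and the weights toward the unknown system's coefficients (system identification, one of the classic adaptive-filter configurations). The thesis's contribution is, in effect, replacing the tapped-delay-line structure here with a recursive neural network while keeping a sample-by-sample update of this LMS character.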
Algorithm; Applied; Backpropagation; Globally; Networks; Neural; Recurrent; Recursive
Electrical engineering; Computer science; Artificial intelligence
University of Nevada, Las Vegas
Dionisi, Steven Michael, "Recursive backpropagation algorithm applied to a globally recurrent neural network" (1994). UNLV Retrospective Theses & Dissertations. 442.
IN COPYRIGHT. For more information about this rights statement, please visit http://rightsstatements.org/vocab/InC/1.0/