Award Date

1-1-1994

Degree Type

Thesis

Degree Name

Master of Science (MS)

Department

Electrical and Computer Engineering

Number of Pages

138

Abstract

In general, recursive neural networks can yield a smaller structure than a purely feedforward neural network, in the same way that infinite impulse response (IIR) filters can replace longer finite impulse response (FIR) filters. This thesis presents a new adaptive algorithm for training recursive neural networks. The algorithm is based on least mean square (LMS) algorithms designed for other adaptive architectures, and it overcomes several limitations of current recursive neural network algorithms, such as epoch training and the requirement for large amounts of memory storage. To demonstrate the new algorithm, adaptive architectures constructed with a recursive neural network and trained with the new algorithm are applied to four adaptive systems, and the results are compared to adaptive systems constructed with other adaptive filters. In these examples, the new algorithm shows the ability to perform linear and nonlinear transformations and, in some cases, significantly outperforms the other adaptive filters. This thesis also discusses possible avenues for future exploration of adaptive systems constructed of recursive neural networks.
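For context, the sketch below shows only the standard sample-by-sample LMS update for a conventional adaptive FIR filter, the family of algorithms the abstract says the thesis builds on; it is not the thesis's recursive-network training algorithm, and the function name and parameters (lms_filter, num_taps, mu) are illustrative assumptions.

    import numpy as np

    def lms_filter(x, d, num_taps=4, mu=0.01):
        """Standard LMS adaptive FIR filter (illustrative sketch only).

        x        : input signal, 1-D array
        d        : desired signal, same length as x
        num_taps : number of adaptive filter weights
        mu       : step size controlling adaptation speed
        Returns the filter output y and error signal e.
        """
        w = np.zeros(num_taps)           # adaptive tap weights
        y = np.zeros(len(x))             # filter output
        e = np.zeros(len(x))             # error signal
        for n in range(num_taps, len(x)):
            u = x[n - num_taps:n][::-1]  # most recent input samples, newest first
            y[n] = w @ u                 # current filter output
            e[n] = d[n] - y[n]           # instantaneous error
            w += mu * e[n] * u           # LMS weight update, one sample at a time
        return y, e

Because the weights are updated from the instantaneous error at each sample, no epoch training or large history buffer is needed, which is the property the abstract highlights for the proposed recursive-network algorithm.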

Keywords

Algorithm; Applied; Backpropagation; Globally; Networks; Neural; Recurrent; Recursive

Controlled Subject

Electrical engineering; Computer science; Artificial intelligence

File Format

pdf

File Size

4003.84 KB

Degree Grantor

University of Nevada, Las Vegas

Language

English

Permissions

If you are the rightful copyright holder of this dissertation or thesis and wish to have the full text removed from Digital Scholarship@UNLV, please submit a request to digitalscholarship@unlv.edu and include clear identification of the work, preferably with a URL.

Rights

IN COPYRIGHT. For more information about this rights statement, please visit http://rightsstatements.org/vocab/InC/1.0/

