Award Date

1-1-1993

Degree Type

Thesis

Degree Name

Master of Science (MS)

Department

Computer Science

Number of Pages

194

Abstract

This thesis examines sequential learning in a recurrent neural network model developed from the work of M. I. Jordan and J. L. Elman. In each of three experiments, a different network parameter is systematically varied across a series of simulations. Each simulation measures learning ability for a specific network configuration, and the results are consolidated to summarize each parameter's significance in the learning process.

Keywords

Adjustment; Dynamic; Effects; Learning; Networks; Parameters; Recurrent

Controlled Subject

Computer science

File Format

pdf

File Size

4567.04 KB

Degree Grantor

University of Nevada, Las Vegas

Language

English

Permissions

If you are the rightful copyright holder of this dissertation or thesis and wish to have the full text removed from Digital Scholarship@UNLV, please submit a request to digitalscholarship@unlv.edu that clearly identifies the work, preferably with its URL.

Identifier

https://doi.org/10.25669/td4n-2dd5
