Award Date
1-1-1993
Degree Type
Thesis
Degree Name
Master of Science (MS)
Department
Computer Science
Number of Pages
194
Abstract
This thesis examines sequential learning in a neural network model derived from the recurrent architectures of M. I. Jordan and J. L. Elman. Each of three experiments systematically varies a different network parameter across a series of simulations, and each simulation measures learning ability for a specific network configuration. The simulation results are consolidated to summarize each parameter's significance in the learning process.
Keywords
Adjustment; Dynamic; Effects; Learning; Networks; Parameters; Recurrent
Controlled Subject
Computer science
File Format
File Size
4567.04 KB
Degree Grantor
University of Nevada, Las Vegas
Language
English
Permissions
If you are the rightful copyright holder of this dissertation or thesis and wish to have the full text removed from Digital Scholarship@UNLV, please submit a request to digitalscholarship@unlv.edu and include clear identification of the work, preferably with a URL.
Repository Citation
Felgar, Stephen Lee, "On the effect of dynamic adjustment of recurrent network parameters on learning" (1993). UNLV Retrospective Theses & Dissertations. 303.
http://dx.doi.org/10.25669/td4n-2dd5
Rights
IN COPYRIGHT. For more information about this rights statement, please visit http://rightsstatements.org/vocab/InC/1.0/