Master of Science (MS)
The thesis examines sequential learning in recurrent neural network models developed by M. I. Jordan and J. L. Elman. In each of three experiments, different network parameters are systematically varied across a series of simulations. Each simulation measures learning ability for a specific network configuration, and the simulation results are consolidated to summarize each parameter's significance in the learning process.
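The Elman architecture referenced in the abstract feeds the hidden layer's activations back as a "context" input at the next time step, which is what gives the network its memory of the sequence. A minimal forward-pass sketch of this idea follows; the dimensions, weight initialization, and activation choice are illustrative placeholders, not the thesis's actual configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative layer sizes (not taken from the thesis)
n_in, n_hidden, n_out = 4, 8, 3

# Weights: input->hidden, context->hidden, hidden->output
W_xh = rng.normal(scale=0.1, size=(n_hidden, n_in))
W_ch = rng.normal(scale=0.1, size=(n_hidden, n_hidden))
W_hy = rng.normal(scale=0.1, size=(n_out, n_hidden))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(sequence):
    """Run an input sequence through an Elman-style network.

    The context units start at zero and, after every step, copy the
    current hidden activations for use at the next step.
    """
    context = np.zeros(n_hidden)
    outputs = []
    for x in sequence:
        hidden = sigmoid(W_xh @ x + W_ch @ context)
        outputs.append(sigmoid(W_hy @ hidden))
        context = hidden  # Elman feedback: context <- hidden layer
    return outputs

# Usage: five random time steps of input
seq = [rng.normal(size=n_in) for _ in range(5)]
ys = forward(seq)
print(len(ys), ys[0].shape)  # one output vector per time step
```

A Jordan network differs only in the feedback path: the context units copy the previous *output* vector rather than the hidden layer, so `context = hidden` above would become a copy of the output (with the context weight matrix resized to match).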
Adjustment; Dynamic; Effects; Learning; Networks; Parameters; Recurrent
University of Nevada, Las Vegas
If you are the rightful copyright holder of this dissertation or thesis and wish to have the full text removed from Digital Scholarship@UNLV, please submit a request to firstname.lastname@example.org and include clear identification of the work, preferably with URL.
Felgar, Stephen Lee, "On the effect of dynamic adjustment of recurrent network parameters on learning" (1993). UNLV Retrospective Theses & Dissertations. 303.