Quantitative Measures to evaluate Neural Network Weight Initialization Strategies
Document Type
Conference Proceeding
Publication Date
1-9-2017
Publication Title
IEEE 7th Annual Computing and Communication Workshop and Conference
Volume
2017
Abstract
It has been reported numerous times in the neural network research literature that weight initialization affects the learning rate, the convergence rate, and the probability of correct classification. In this paper we develop a theory for objectively testing weight initialization strategies, providing a quantitative measure for each one. For each initialization strategy and each epoch, we estimate the conditional probability distribution function of correct classification given the epoch number. For a given strategy and epoch, this conditional probability is a random variable with a certain probability distribution function and a certain mean and variance. Based on multivariate analysis, statistics of extremes, analysis of variance, and estimation theory, we develop an objective framework and measurements to assess whether one strategy is better than another, or whether the differences between strategies are not significant and merely due to random fluctuations.
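The analysis-of-variance component of the framework described above can be sketched as follows. This is an illustrative sketch only, not the authors' implementation: the strategy names, accuracy means, and run counts are hypothetical, and only the one-way ANOVA F-statistic (between-strategy vs. within-strategy variance of per-run accuracy at a fixed epoch) is shown.

```python
import random
import statistics

def simulate_accuracy(mean, std, runs, seed):
    """Simulate per-run classification accuracy at a fixed epoch.

    Hypothetical stand-in for accuracies measured over repeated
    training runs with one initialization strategy.
    """
    rng = random.Random(seed)
    return [min(1.0, max(0.0, rng.gauss(mean, std))) for _ in range(runs)]

def one_way_anova_f(groups):
    """One-way ANOVA F statistic: between-group over within-group variance."""
    k = len(groups)                      # number of strategies
    n = sum(len(g) for g in groups)      # total number of runs
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (statistics.mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - statistics.mean(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Two hypothetical initialization strategies, 30 runs each, same epoch.
strategy_a = simulate_accuracy(mean=0.92, std=0.02, runs=30, seed=1)
strategy_b = simulate_accuracy(mean=0.90, std=0.02, runs=30, seed=2)

f_stat = one_way_anova_f([strategy_a, strategy_b])
print(f"F = {f_stat:.2f}")
# A large F relative to the F(k-1, n-k) critical value suggests the
# difference between strategies is not due to random fluctuation.
```

In the paper's framework this test would be applied per epoch, alongside the other tools mentioned (multivariate analysis, statistics of extremes, estimation theory), to decide whether observed accuracy differences between strategies are statistically significant.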
Language
English
Repository Citation
Zamora, E., Nakakuni, M., & Yfantis, E. A. (2017). Quantitative Measures to evaluate Neural Network Weight Initialization Strategies. IEEE 7th Annual Computing and Communication Workshop and Conference, 2017. http://dx.doi.org/10.1109/CCWC.2017.7868389