A stochastically motivated random initialization of pattern classifying MLPs
- 1 April 1996
- journal article
- Published by Springer Science and Business Media LLC in Neural Processing Letters
- Vol. 3 (1), 23-29
- https://doi.org/10.1007/bf00417786
Abstract
In this contribution, a new stochastically motivated random weight initialization scheme for pattern classifying Multi-Layer Perceptrons (MLPs) is presented. Its first aim is to ensure that all training examples and all nodes have an equal opportunity to contribute to improving the network during Error Back-Propagation (EBP) training. In addition, it pursues input scale invariance: if the network inputs were replaced by rescaled versions, the initialization procedure should yield an equally well performing network. Finally, the new algorithm can initialize MLPs comprising both concentric (e.g., Gaussian) and squashing (e.g., sigmoidal) nodes. Experiments demonstrate that networks initialized with the proposed method train better than networks initialized with a standard random initialization scheme.
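The abstract's scale-invariance goal can be illustrated with a small sketch. This is not the paper's actual algorithm (which is not reproduced in this record); it is a hypothetical data-dependent initialization in which each input-to-hidden weight's scale is tied inversely to the corresponding feature's standard deviation, so rescaling an input feature leaves the initial pre-activations unchanged.

```python
import numpy as np

def scale_invariant_init(X, n_hidden, rng=None):
    """Hypothetical sketch of a scale-invariant initialization.

    Weights for feature j are drawn with standard deviation proportional
    to 1 / std(x_j), and biases cancel the feature means, so multiplying
    a feature by any constant c leaves the initial node inputs unchanged.
    """
    rng = np.random.default_rng() if rng is None else rng
    n_in = X.shape[1]
    mu = X.mean(axis=0)
    sigma = X.std(axis=0) + 1e-12  # guard against constant features
    # If x_j is multiplied by c, w_j shrinks by 1/c, so w_j * x_j is
    # unaffected by the rescaling.
    W = rng.normal(0.0, 1.0, size=(n_hidden, n_in)) / (sigma * np.sqrt(n_in))
    # Center each node on the data mean so it starts in the responsive
    # region of its nonlinearity for typical inputs.
    b = -W @ mu
    return W, b
```

Under this construction, feeding the network rescaled inputs after re-running the (seeded) initialization on the rescaled data reproduces the original pre-activations exactly, which is one concrete reading of "an equally well performing network" at the start of training.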