Training neural networks with additive noise in the desired signal
- 1 January 1999
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 10 (6), 1511-1517
- https://doi.org/10.1109/72.809097
Abstract
A new global optimization strategy for training adaptive systems such as neural networks and adaptive filters [finite or infinite impulse response (FIR or IIR)] is proposed in this paper. Instead of adding random noise to the weights as proposed in the past, additive random noise is injected directly into the desired signal. Experimental results show that this procedure also greatly speeds up the backpropagation algorithm. The method is very easy to implement in practice, preserving the backpropagation algorithm and requiring a single random generator with a monotonically decreasing step size per output channel. Hence, this is an ideal strategy to speed up supervised learning and avoid entrapment in local minima when the noise variance is appropriately scheduled.
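The abstract's recipe (zero-mean noise added to the desired signal, with an annealed variance, while ordinary backpropagation runs unchanged) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the toy regression task, network size, learning rate, and the particular 1/(1+t) decay schedule are all assumptions made here for concreteness.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task (assumed for illustration): fit y = sin(x)
# with a one-hidden-layer tanh MLP trained by plain backpropagation.
X = np.linspace(-np.pi, np.pi, 64).reshape(-1, 1)
Y = np.sin(X)

W1 = rng.normal(0.0, 0.5, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.05                # assumed learning rate

def forward(X):
    H = np.tanh(X @ W1 + b1)
    return H, H @ W2 + b2

_, out0 = forward(X)
mse0 = float(np.mean((out0 - Y) ** 2))   # error before training

sigma0 = 0.5             # assumed initial noise standard deviation
for epoch in range(2000):
    # Monotonically decreasing noise schedule (assumed hyperbolic decay).
    sigma = sigma0 / (1.0 + 0.01 * epoch)

    # Key idea from the paper: perturb the desired signal, not the weights.
    # One random generator per output channel suffices.
    d = Y + rng.normal(0.0, sigma, Y.shape)

    # Standard backpropagation against the noisy desired signal.
    H, out = forward(X)
    err = out - d                              # gradient of 0.5 * MSE
    gW2 = H.T @ err / len(X); gb2 = err.mean(0)
    dH = (err @ W2.T) * (1.0 - H ** 2)         # backprop through tanh
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, out = forward(X)
mse = float(np.mean((out - Y) ** 2))     # error on the clean targets
```

Because the noise is zero-mean and its variance shrinks over epochs, the perturbed gradient agrees with the clean gradient on average early on (helping escape shallow minima) and converges to ordinary backpropagation as the schedule anneals to zero.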