A MODIFIED ERROR BACKPROPAGATION ALGORITHM FOR COMPLEX-VALUE NEURAL NETWORKS
- 1 December 2005
- journal article
- Published by World Scientific Pub Co Pte Ltd in International Journal of Neural Systems
- Vol. 15 (6), 435-443
- https://doi.org/10.1142/s0129065705000426
Abstract
The complex-valued backpropagation algorithm has been widely used in fields such as telecommunications, speech recognition, and image processing with the Fourier transform. However, the local minima problem usually occurs during learning. To solve this problem and to speed up learning, we propose a modified error function that adds to the conventional error function a term corresponding to the hidden-layer error. Simulation results show that the proposed algorithm prevents learning from getting stuck in local minima and speeds up the learning process.
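The conventional complex-valued backpropagation that the abstract builds on can be sketched as follows: a minimal NumPy implementation of split-complex backprop, where a real sigmoid is applied independently to the real and imaginary parts of each pre-activation. This is only a sketch of the standard algorithm, not the paper's method; the paper's specific hidden-layer error term is not reproduced (its exact form is not given in the abstract), and a comment marks where it would enter. The teacher-student toy task and all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def split_sigmoid(z):
    """Split activation: real sigmoid applied separately to Re(z) and Im(z)."""
    return sigmoid(z.real) + 1j * sigmoid(z.imag)

def split_delta(err, s):
    """Componentwise delta for the split activation.

    err -- complex error backpropagated to this layer
    s   -- complex pre-activation of this layer
    """
    yr, yi = sigmoid(s.real), sigmoid(s.imag)
    return err.real * yr * (1 - yr) + 1j * (err.imag * yi * (1 - yi))

def init(n_in, n_h, n_out):
    """Random small complex weights and biases for a 1-hidden-layer network."""
    def cw(a, b):
        return 0.5 * (rng.standard_normal((a, b)) + 1j * rng.standard_normal((a, b)))
    return [cw(n_h, n_in), cw(n_h, 1), cw(n_out, n_h), cw(n_out, 1)]

def forward(params, X):
    W1, b1, W2, b2 = params
    S1 = W1 @ X + b1          # hidden pre-activation
    H = split_sigmoid(S1)     # hidden activation
    S2 = W2 @ H + b2          # output pre-activation
    Y = split_sigmoid(S2)     # network output
    return S1, H, S2, Y

def train_step(params, X, T, lr=0.5):
    """One gradient step on the conventional error E = 0.5 * sum |Y - T|^2."""
    W1, b1, W2, b2 = params
    N = X.shape[1]
    S1, H, S2, Y = forward(params, X)
    E2 = Y - T                          # output-layer error
    D2 = split_delta(E2, S2)
    # Error propagated back to the hidden layer. The paper's modification adds
    # a term to E corresponding to this hidden-layer error; that extra term
    # would be injected here (exact form not reproduced from the abstract).
    E1 = W2.conj().T @ D2
    D1 = split_delta(E1, S1)
    # Complex gradients: grad_W = delta @ conj(input).T
    W2 -= lr * (D2 @ H.conj().T) / N
    b2 -= lr * D2.mean(axis=1, keepdims=True)
    W1 -= lr * (D1 @ X.conj().T) / N
    b1 -= lr * D1.mean(axis=1, keepdims=True)
    return 0.5 * np.sum(np.abs(E2) ** 2) / N

# Toy task: fit targets produced by a fixed random "teacher" network.
X = rng.standard_normal((2, 64)) + 1j * rng.standard_normal((2, 64))
teacher = init(2, 4, 1)
T = forward(teacher, X)[3]
student = init(2, 4, 1)
losses = [train_step(student, X, T) for _ in range(2000)]
```

The split activation sidesteps the fact that a bounded, everywhere-analytic complex activation cannot exist (Liouville's theorem), which is why much of the cited literature uses either split or fully complex activations with restricted domains.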
This publication has 9 references indexed in Scilit:
- A modified error function for the backpropagation algorithm. Neurocomputing, 2004
- Solving the XOR problem and the detection of symmetry using a single complex-valued neuron. Neural Networks, 2003
- Approximation by Fully Complex Multilayer Perceptrons. Neural Computation, 2003
- Fully Complex Multi-Layer Perceptron Network for Nonlinear Signal Processing. Journal of Signal Processing Systems, 2002
- Training neural networks with additive noise in the desired signal. IEEE Transactions on Neural Networks, 1999
- An Extension of the Back-Propagation Algorithm to Complex Numbers. Neural Networks, 1997
- Complex domain backpropagation. IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing, 1992
- On the complex backpropagation algorithm. IEEE Transactions on Signal Processing, 1992
- Approximation by superpositions of a sigmoidal function. Mathematics of Control, Signals, and Systems, 1989