Evolving space-filling curves to distribute radial basis functions over an input space
- 1 January 1994
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 5 (1), 15-23
- https://doi.org/10.1109/72.265957
Abstract
An evolutionary neural network training algorithm is proposed for radial basis function (RBF) networks. The locations of basis function centers are not directly encoded in a genetic string, but are governed by space-filling curves whose parameters evolve genetically. This encoding causes each group of codetermined basis functions to evolve to fit a region of the input space. A network produced from this encoding is evaluated by training its output connections only. Networks produced by this evolutionary algorithm appear to have better generalization performance on the Mackey-Glass time series than corresponding networks whose centers are determined by k-means clustering.
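The scheme the abstract describes can be illustrated with a minimal sketch: 1-D parameters (which, in the paper, would be evolved genetically) are mapped through a space-filling curve to center locations in the input space, and only the output weights are then fit by linear least squares. This is not the paper's implementation; the choice of a Hilbert curve, the Gaussian basis, and the fixed width are assumptions made for illustration.

```python
import numpy as np

def hilbert_d2xy(order, d):
    """Map distance d along a 2**order x 2**order Hilbert curve to (x, y)."""
    x = y = 0
    t = d
    s = 1
    while s < (1 << order):
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:           # rotate the quadrant so segments stay contiguous
            if rx == 1:
                x = s - 1 - x
                y = s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

def rbf_design(X, centers, width):
    """Gaussian RBF activations for inputs X given fixed centers."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

# 1-D curve parameters; in the evolutionary algorithm these would be evolved,
# here they are simply spaced evenly along the curve for illustration.
order = 4
side = 2 ** order
params = np.linspace(0, side * side - 1, 10).astype(int)
centers = np.array([hilbert_d2xy(order, int(d)) for d in params]) / (side - 1)

# Evaluate a candidate network by training its output connections only:
# a single linear least-squares solve, no gradient descent on the centers.
rng = np.random.default_rng(0)
X = rng.random((200, 2))
y = np.sin(2 * np.pi * X[:, 0]) * X[:, 1]      # toy target, stand-in for Mackey-Glass
H = rbf_design(X, centers, width=0.2)
w, *_ = np.linalg.lstsq(H, y, rcond=None)
pred = H @ w
```

Because consecutive curve parameters map to nearby points in the input space, a group of codetermined parameters naturally places its basis functions within one region, which is the locality property the encoding exploits.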