Adding Learning to the Cellular Development of Neural Networks: Evolution and the Baldwin Effect
- 1 September 1993
- journal article
- Published by MIT Press in Evolutionary Computation
- Vol. 1 (3), 213-233
- https://doi.org/10.1162/evco.1993.1.3.213
Abstract
A grammar tree is used to encode a cellular developmental process that can generate whole families of Boolean neural networks for computing parity and symmetry. The developmental process resembles biological cell division. A genetic algorithm is used to find a grammar tree that yields both the architecture and the weights specifying a particular neural network for solving specific Boolean functions. The current study focuses on adding learning to the developmental process and the evolution of grammar trees. Three ways of adding learning to the developmental process are explored. Two of these exploit the Baldwin effect by changing the fitness landscape without using Lamarckian evolution; the third is Lamarckian in nature. Results for these three modes of combining learning with genetic search are compared against genetic search without learning. Our results suggest that merely using learning to change the fitness landscape can be as effective as Lamarckian strategies at improving search.
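The distinction the abstract draws between Baldwinian and Lamarckian learning can be illustrated with a minimal sketch. This is not the paper's actual system (which evolves grammar trees that develop into networks); it is a hypothetical toy genetic algorithm over a flat weight vector, with a deterministic "lifetime learning" step. In Baldwinian mode, learning only reshapes the fitness landscape (the learned weights are evaluated but never written back to the genotype); in Lamarckian mode, the learned weights replace the genotype. The `TARGET` vector, learning rate, and population settings are all illustrative assumptions.

```python
import random

TARGET = [0.0, 1.0, 1.0, 0.0]  # hypothetical target outputs for a toy task


def fitness(genome):
    # Negative squared error against the target vector (higher is better).
    return -sum((g - t) ** 2 for g, t in zip(genome, TARGET))


def learn(genome, steps=2, lr=0.5):
    # Crude deterministic "lifetime learning": each step shrinks the
    # error on every component by a factor of (1 - lr).
    g = list(genome)
    for _ in range(steps):
        g = [w + lr * (t - w) for w, t in zip(g, TARGET)]
    return g


def evaluate(genome, mode):
    # Returns (fitness used for selection, genotype passed to offspring).
    if mode == "none":
        return fitness(genome), list(genome)
    learned = learn(genome)
    if mode == "lamarckian":
        # Learned values are written back into the genotype.
        return fitness(learned), learned
    # Baldwinian: learning changes the fitness landscape only;
    # the inherited genotype is untouched.
    return fitness(learned), list(genome)


def run_ga(mode, pop_size=20, gens=30, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(4)] for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(
            (evaluate(g, mode) for g in pop),
            key=lambda s: s[0],
            reverse=True,
        )
        elite = [g for _, g in scored[: pop_size // 2]]
        # Offspring: mutated copies of randomly chosen elite genotypes.
        pop = [
            [w + rng.gauss(0, 0.1) for w in rng.choice(elite)]
            for _ in range(pop_size)
        ]
    return max(evaluate(g, mode)[0] for g in pop)
```

Because the learning step always moves weights toward the target, a Baldwinian and a Lamarckian individual with the same genotype receive the same (improved) fitness; they differ only in what the offspring inherit, which is the crux of the Baldwin effect.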