Theory and development of higher-order CMAC neural networks

Abstract
The cerebellar model articulation controller (CMAC) neural network is capable of learning nonlinear functions extremely quickly due to the local nature of its weight updating. The rectangular shape of CMAC receptive field functions, however, produces discontinuous (staircase) function approximations without inherent analytical derivatives. The ability to learn both functions and function derivatives is important for the development of many online adaptive filter, estimation, and control algorithms. It is shown that use of B-spline receptive field functions in conjunction with more general CMAC weight addressing schemes allows higher-order CMAC neural networks to be developed that can learn both functions and function derivatives. This also allows hierarchical and multilayer CMAC network architectures to be constructed that can be trained using standard error back-propagation learning techniques.
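To make the idea concrete, the sketch below is a minimal 1-D CMAC in which a smooth quadratic bump stands in for the B-spline receptive field functions the paper describes, so the network output (and its derivative) varies smoothly with the input while weight updates remain local. This is an illustrative assumption-laden example, not the authors' implementation; the class name `BSplineCMAC` and parameters such as `n_layers`, `cells_per_layer`, and `lr` are invented for this sketch.

```python
# Illustrative sketch only: a 1-D CMAC with smooth (B-spline-like) receptive fields.
# Not the authors' code; names and parameter choices are assumptions for this example.
import numpy as np

class BSplineCMAC:
    def __init__(self, n_layers=8, cells_per_layer=20, x_min=0.0, x_max=1.0, lr=0.5):
        self.n_layers = n_layers                  # overlapping, offset quantizing layers
        self.cells = cells_per_layer              # receptive fields per layer
        self.x_min, self.x_max = x_min, x_max
        self.lr = lr
        self.width = (x_max - x_min) / (cells_per_layer - 1)
        self.w = np.zeros((n_layers, cells_per_layer))  # one weight per receptive field

    def _activations(self, x):
        """Active fields for input x: (layer, cell, phi, dphi/dx) per layer."""
        acts = []
        for layer in range(self.n_layers):
            # Each layer is shifted by a fraction of the field width (CMAC overlap scheme).
            offset = layer / self.n_layers * self.width
            u = (x - self.x_min + offset) / self.width
            cell = int(np.floor(u))
            if 0 <= cell < self.cells:
                t = u - cell                      # position within the field, in [0, 1)
                # Smooth quadratic bump in place of the rectangular indicator:
                # zero at the field edges, so the summed output has an analytic derivative.
                phi = 4.0 * t * (1.0 - t)
                dphi = 4.0 * (1.0 - 2.0 * t) / self.width
                acts.append((layer, cell, phi, dphi))
        return acts

    def predict(self, x):
        acts = self._activations(x)
        y = sum(self.w[l, c] * phi for l, c, phi, _ in acts)
        dy = sum(self.w[l, c] * dphi for l, c, _, dphi in acts)
        return y, dy

    def train(self, x, target):
        """Local LMS update: only the weights of the currently active fields change."""
        y, _ = self.predict(x)
        err = target - y
        acts = self._activations(x)
        norm = sum(phi * phi for _, _, phi, _ in acts) + 1e-12
        for l, c, phi, _ in acts:
            self.w[l, c] += self.lr * err * phi / norm

if __name__ == "__main__":
    cmac = BSplineCMAC()
    for x in np.random.rand(2000):
        cmac.train(x, np.sin(2 * np.pi * x))
    y, dy = cmac.predict(0.25)   # true values: f = 1.0, f' = 0.0
    print(f"f(0.25) ~ {y:.3f}, f'(0.25) ~ {dy:.3f}")
```

Because only the handful of weights whose receptive fields cover the current input are adjusted, training retains the fast local learning of the standard CMAC, while the smooth field functions remove the staircase behavior and expose a usable derivative estimate.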
