Abstract
In this paper we focus on an interpretation of Gaussian radial basis functions (GRBF) which motivates extensions and learning strategies. Specifically, we show that GRBF regression equations naturally result from representing the input-output joint probability density function by a finite mixture of Gaussians. Corollaries of this interpretation are: some special forms of GRBF representations can be traced back to the type of Gaussian mixture used; previously proposed learning methods based on input-output clustering acquire a new meaning; finally, estimation techniques for finite mixtures (namely the EM algorithm and model selection criteria) can be invoked to learn GRBF regression equations.
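To make the central claim concrete: if the joint density of (x, y) is a finite Gaussian mixture with diagonal covariances, the conditional mean E[y|x] is a convex combination of the component output means, weighted by normalized Gaussian radial basis functions of x — exactly a GRBF regression equation. The following is a minimal numerical sketch of that identity; all mixture parameters (weights `alphas`, means `mu_x`, `mu_y`, input variances `sx2`) are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Illustrative two-component Gaussian mixture over (x, y) with
# diagonal covariances; parameters are assumed for this sketch.
alphas = np.array([0.5, 0.5])   # mixing weights
mu_x = np.array([-1.0, 1.0])    # component means in the input space
mu_y = np.array([0.0, 2.0])     # component means in the output space
sx2 = np.array([0.5, 0.5])      # input variances of each component

def grbf_regression(x):
    """E[y|x] under the mixture: normalized Gaussian RBF weights
    applied to the component output means."""
    # Unnormalized responsibilities: alpha_m * N(x; mu_x[m], sx2[m])
    rbf = alphas * np.exp(-(x - mu_x) ** 2 / (2.0 * sx2)) \
          / np.sqrt(2.0 * np.pi * sx2)
    w = rbf / rbf.sum()         # normalized GRBF weights, summing to 1
    return np.dot(w, mu_y)      # convex combination of output means
```

By symmetry of the assumed parameters, `grbf_regression(0.0)` weights both components equally and returns the midpoint of the two output means; near a component's input mean, the prediction is pulled toward that component's output mean.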
