An information criterion for optimal neural network selection
- 1 January 1991
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 2 (5), 490-497
- https://doi.org/10.1109/72.134286
Abstract
The choice of an optimal neural network design for a given problem is addressed. A relationship between optimal network design and statistical model identification is described. A derivative of Akaike's information criterion (AIC) is given. This modification yields an information statistic which can be used to objectively select a 'best' network for binary classification problems. The technique can be extended to problems with an arbitrary number of classes.
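The selection rule the abstract describes can be illustrated with a minimal sketch, assuming the standard AIC form (-2 log-likelihood + 2 × number of free parameters) with a Bernoulli likelihood for binary classification. The candidate networks' predicted probabilities and parameter counts below are invented purely for illustration, not taken from the paper.

```python
import math

def aic_binary(y_true, y_prob, n_params):
    """AIC-style criterion for a binary classifier:
    -2 * Bernoulli log-likelihood + 2 * (number of free parameters)."""
    log_likelihood = sum(y * math.log(p) + (1 - y) * math.log(1 - p)
                         for y, p in zip(y_true, y_prob))
    return -2.0 * log_likelihood + 2.0 * n_params

# Toy comparison of two hypothetical candidate networks evaluated on
# the same labels (values invented for illustration).
y = [0, 1, 1, 0, 1]
probs_small = [0.2, 0.8, 0.7, 0.3, 0.6]   # assumed 10 free weights
probs_large = [0.1, 0.9, 0.8, 0.2, 0.7]   # assumed 50 free weights

scores = {
    "small": aic_binary(y, probs_small, n_params=10),
    "large": aic_binary(y, probs_large, n_params=50),
}
best = min(scores, key=scores.get)  # lower criterion value is better
```

Here the larger network fits the labels slightly better, but the penalty on its extra parameters outweighs the gain in likelihood, so the criterion selects the smaller network.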
This publication has 12 references indexed in Scilit:
- Evolving neural networks, Biological Cybernetics, 1990
- Statistical analysis of the performance of information theoretic criteria in the detection of the number of signals in array processing, IEEE Transactions on Acoustics, Speech, and Signal Processing, 1989
- An evolutionary approach to the traveling salesman problem, Biological Cybernetics, 1988
- An algebraic projection analysis for optimal hidden units size and learning rates in back-propagation learning, published by Institute of Electrical and Electronics Engineers (IEEE), 1988
- A Bayesian extension of the minimum AIC procedure of autoregressive model fitting, Biometrika, 1979
- Modeling by shortest data description, Automatica, 1978
- Estimating the Dimension of a Model, The Annals of Statistics, 1978
- Selection of the order of an autoregressive model by Akaike's information criterion, Biometrika, 1976
- A new look at the statistical model identification, IEEE Transactions on Automatic Control, 1974
- Some Comments on C_p, Technometrics, 1973