Combining cross-validation and confidence to measure fitness
- 22 January 2003
- conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IJCNN'99. International Joint Conference on Neural Networks. Proceedings (Cat. No.99CH36339)
- Vol. 2, pp. 1409-1414
- https://doi.org/10.1109/ijcnn.1999.831170
Abstract
Neural network and machine learning algorithms often have parameters that must be tuned for good performance on a particular task. Leave-one-out cross-validation (LCV) accuracy is often used to measure the fitness of a set of parameter values. However, small changes in parameters often have no effect on LCV accuracy. Many learning algorithms can measure the confidence of a classification decision, but confidence alone is often an inappropriate measure of fitness. This paper proposes a combined measure of cross-validation and confidence (CVC) for obtaining a continuous measure of fitness for sets of parameters in learning algorithms. This paper also proposes the refined instance-based learning algorithm, which illustrates the use of CVC in automated parameter tuning. Using CVC provides significant improvement in generalization accuracy on a collection of 31 classification tasks when compared to using LCV.
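The core idea of the abstract can be sketched in code: LCV accuracy is a step function of the parameters, so two parameter settings that classify the same instances correctly are indistinguishable, while adding the classifier's confidence in the true class yields a continuous fitness surface. The sketch below uses a distance-weighted k-NN classifier and the combination `(N*cv + avg_conf) / (N + 1)`; both the classifier and the exact combining formula are illustrative assumptions, not necessarily the paper's precise formulation.

```python
import math

def knn_confidence(train, query, k=3):
    """Class-probability estimates for `query` from distance-weighted k-NN votes."""
    # train: list of (feature_tuple, label) pairs
    nearest = sorted(
        ((math.dist(x, query), y) for x, y in train), key=lambda t: t[0]
    )[:k]
    votes = {}
    for d, y in nearest:
        votes[y] = votes.get(y, 0.0) + 1.0 / (d + 1e-9)  # closer neighbors vote more
    total = sum(votes.values())
    return {y: v / total for y, v in votes.items()}

def cvc_fitness(data, k=3):
    """Combine leave-one-out CV accuracy with average confidence in the true class.

    cv alone changes only when a held-out instance flips between correct and
    incorrect; the confidence term makes fitness vary continuously, so it can
    break ties between parameter settings with identical LCV accuracy.
    """
    n = len(data)
    correct = 0
    conf_sum = 0.0
    for i, (x, y) in enumerate(data):
        rest = data[:i] + data[i + 1:]          # leave instance i out
        probs = knn_confidence(rest, x, k)
        conf_sum += probs.get(y, 0.0)           # confidence assigned to true class
        if max(probs, key=probs.get) == y:
            correct += 1
    cv = correct / n
    avg_conf = conf_sum / n
    return (n * cv + avg_conf) / (n + 1)
```

With this shape, sweeping k (or a distance-weighting parameter) and picking the setting with the highest `cvc_fitness` gives an automated tuner in which confidence differences decide between settings that tie on LCV accuracy.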