Multiple-prototype classifier design

Abstract
We review five methods that generate multiple prototypes from labeled data, and then introduce a sixth approach, a modification of Chang's (1974) method. The six methods are compared with two standard classifier designs: the 1-nearest-prototype (1-np) and 1-nearest-neighbor (1-nn) rules. The standard of comparison is the resubstitution error rate, and the data used are the Iris data. Our modification of Chang's method produces the best consistent (zero-error) design. One of the competitive learning models produces the best minimal-prototype design (five prototypes that yield three resubstitution errors).
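To make the comparison criteria concrete, the sketch below illustrates the 1-np rule and the resubstitution error rate on the Iris data. It is only an illustration under stated assumptions: the prototypes are taken to be the per-class means, which is a hypothetical choice and not one of the six prototype-generation methods compared in the paper.

```python
# Illustrative sketch only: a 1-nearest-prototype (1-np) classifier evaluated by
# resubstitution error on the Iris data. The class means are used as prototypes
# purely for illustration; the paper's prototype-generation methods are not
# reproduced here.
import numpy as np
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)

# One prototype per class: the class mean (an assumption for this sketch).
classes = np.unique(y)
prototypes = np.array([X[y == c].mean(axis=0) for c in classes])

# 1-np rule: assign each sample to the class of its nearest prototype.
dists = np.linalg.norm(X[:, None, :] - prototypes[None, :, :], axis=2)
predictions = classes[np.argmin(dists, axis=1)]

# Resubstitution error rate: the fraction of the design (training) data that the
# classifier misclassifies when reapplied to that same data.
errors = int(np.sum(predictions != y))
print(f"resubstitution errors: {errors} / {len(y)} (rate = {errors / len(y):.3f})")
```

A consistent design, in the sense used above, is one whose resubstitution error is zero; a minimal-prototype design uses as few prototypes as possible while keeping the resubstitution error low.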
