Abstract
An algorithm is presented for calculating the recognition error incurred when pattern vectors are applied to an optimum Bayes' classifier. The pattern vectors are assumed to come from two classes whose populations have Gaussian statistics with unequal covariance matrices and arbitrary a priori probabilities. The quadratic discriminant function associated with the Bayes' classifier is treated as a one-dimensional random variable; once its distribution is obtained, the probability of error is calculated from it.
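The setting described above can be illustrated with a short Monte Carlo sketch. This is not the paper's algorithm (which derives the distribution of the discriminant function itself); it simply evaluates the quadratic discriminant for two Gaussian classes with unequal covariances and estimates the resulting probability of error empirically. All parameter values are illustrative assumptions.

```python
# Monte Carlo sketch (not the paper's exact method): estimate the error
# of the quadratic Bayes' discriminant for two Gaussian classes with
# unequal covariance matrices. Parameter values are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

# Illustrative class parameters (assumptions, not taken from the paper).
p1, p2 = 0.4, 0.6                          # a priori probabilities
mu1 = np.array([0.0, 0.0])
mu2 = np.array([2.0, 1.0])
S1 = np.array([[1.0, 0.3], [0.3, 1.0]])    # class-1 covariance
S2 = np.array([[2.0, -0.4], [-0.4, 1.5]])  # class-2 covariance (unequal)

S1i, S2i = np.linalg.inv(S1), np.linalg.inv(S2)
log_term = np.log(p1 / p2) - 0.5 * np.log(np.linalg.det(S1) / np.linalg.det(S2))

def g(x):
    """Quadratic discriminant: decide class 1 when g(x) > 0."""
    d1 = x - mu1
    d2 = x - mu2
    return (log_term
            - 0.5 * np.einsum("ij,jk,ik->i", d1, S1i, d1)
            + 0.5 * np.einsum("ij,jk,ik->i", d2, S2i, d2))

n = 200_000
x1 = rng.multivariate_normal(mu1, S1, n)   # samples from class 1
x2 = rng.multivariate_normal(mu2, S2, n)   # samples from class 2

# Total error: class 1 misclassified when g <= 0, class 2 when g > 0.
err = p1 * np.mean(g(x1) <= 0) + p2 * np.mean(g(x2) > 0)
print(f"estimated probability of error: {err:.4f}")
```

Note that g(x) here is exactly the one-dimensional random variable the abstract refers to: the sampled values g(x1) and g(x2) are draws from its class-conditional distributions, and the error probability is read off from how much of each distribution falls on the wrong side of zero.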