Quantization Complexity and Independent Measurements

Abstract
It is known that, in general, the number of measurements in a pattern classification problem cannot be increased arbitrarily when the class-conditional densities are not completely known and only a finite number of learning samples are available. Above a certain number of measurements, the performance starts deteriorating instead of improving steadily. It was earlier shown by one of the authors that an exception to this "curse of finite sample size" is constituted by the case of binary independent measurements if a Bayesian approach is taken and uniform a priori densities on the unknown parameters are assumed. In this paper, the following generalizations are considered: arbitrary quantization and the use of maximum likelihood estimates. Further, the existence of an optimal quantization complexity is demonstrated, and its relationship to both the dimensionality of the measurement vector and the sample size is discussed. It is shown that the optimum number of quantization levels decreases with increasing dimensionality for a fixed sample size, and increases with the sample size for a fixed dimensionality.
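The following is a minimal simulation sketch, not taken from the paper, that illustrates the kind of effect the abstract describes: with dimensionality and sample size fixed, the test error of a plug-in (maximum-likelihood) classifier built from independently quantized measurements tends to fall and then rise as the number of quantization levels grows. The Gaussian class-conditional densities, equal-width quantizer, and all parameter values (d = 8, 50 training samples per class, the candidate level counts) are illustrative assumptions, as are the helper names quantize, ml_cell_probs, and error_rate.

import numpy as np

rng = np.random.default_rng(0)

def quantize(x, m, lo=-3.0, hi=3.0):
    """Quantize real-valued measurements into m equal-width levels on [lo, hi]."""
    edges = np.linspace(lo, hi, m + 1)[1:-1]          # m-1 interior thresholds
    return np.digitize(x, edges)                       # integer labels 0..m-1

def ml_cell_probs(q, m, d):
    """Per-dimension maximum-likelihood estimates of level probabilities
    (independence across measurements is assumed, as in the abstract)."""
    n = q.shape[0]
    probs = np.empty((d, m))
    for j in range(d):
        counts = np.bincount(q[:, j], minlength=m)
        probs[j] = (counts + 1e-12) / n                # tiny floor avoids log(0)
    return probs

def error_rate(m, d, n_train, n_test=5000, delta=0.3):
    """Empirical test error of the plug-in classifier for m quantization levels.
    Class-conditional densities here are Gaussian (an assumption for illustration)."""
    mu0, mu1 = -delta, delta                           # class means per dimension
    x0 = rng.normal(mu0, 1.0, size=(n_train, d))
    x1 = rng.normal(mu1, 1.0, size=(n_train, d))
    p0 = ml_cell_probs(quantize(x0, m), m, d)
    p1 = ml_cell_probs(quantize(x1, m), m, d)

    xt = np.vstack([rng.normal(mu0, 1.0, size=(n_test, d)),
                    rng.normal(mu1, 1.0, size=(n_test, d))])
    yt = np.repeat([0, 1], n_test)
    qt = quantize(xt, m)
    # Log-likelihood under independence: sum over dimensions of log p(level_j | class)
    ll0 = np.log(p0[np.arange(d), qt]).sum(axis=1)
    ll1 = np.log(p1[np.arange(d), qt]).sum(axis=1)
    return np.mean((ll1 > ll0).astype(int) != yt)

if __name__ == "__main__":
    d, n_train = 8, 50                                 # fixed dimensionality and sample size
    for m in (2, 3, 4, 6, 8, 12, 16):
        print(f"m = {m:2d} levels : test error ~ {error_rate(m, d, n_train):.3f}")
    # With n_train fixed, error typically decreases and then increases as m grows,
    # suggesting an intermediate optimum number of quantization levels.

Rerunning the loop with a larger n_train (or a smaller d) should shift the apparent optimum toward more levels, in line with the abstract's stated dependence on sample size and dimensionality.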
