Fast Asymmetric Learning for Cascade Face Detection

Abstract
A cascade face detector uses a sequence of node classifiers to distinguish faces from nonfaces. This paper presents a new approach to designing node classifiers in the cascade detector. Previous methods used machine learning algorithms that simultaneously select features and form ensemble classifiers. We argue that if these two parts are decoupled, we gain the freedom to design a classifier that explicitly addresses the difficulties caused by the asymmetric learning goal. This paper makes three contributions. The first is a categorization of the asymmetries in the learning goal and an explanation of why they make face detection hard. The second is the forward feature selection (FFS) algorithm and a fast precomputing strategy for AdaBoost; FFS and the fast AdaBoost reduce training time by approximately 100 and 50 times, respectively, compared to a naive implementation of the AdaBoost feature selection method. The last contribution is the linear asymmetric classifier (LAC), a classifier that explicitly handles the asymmetric learning goal as a well-defined constrained optimization problem. We demonstrate experimentally that LAC results in improved ensemble classifier performance.
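To illustrate the idea of decoupling feature selection from ensemble formation, the sketch below shows a generic greedy forward selection loop over precomputed weak-classifier outputs. It is a minimal illustration under assumed interfaces (the function name, array shapes, and majority-vote ensemble are my own choices), not the paper's exact FFS algorithm; the key point it captures is that each weak classifier's predictions are computed once up front, so each greedy step only needs table lookups and additions.

```python
import numpy as np

def forward_feature_selection(weak_outputs, labels, num_select):
    """Greedy forward selection over precomputed weak-classifier outputs.

    weak_outputs: (num_weak, num_samples) array of {-1, +1} predictions,
                  computed once in advance (the precomputing step).
    labels:       (num_samples,) array of {-1, +1} ground-truth labels.
    num_select:   number of weak classifiers to include in the ensemble.
    Returns the list of selected weak-classifier indices.
    """
    num_weak, _ = weak_outputs.shape
    vote_sum = np.zeros(weak_outputs.shape[1])   # running sum of selected votes
    selected = []
    for _ in range(num_select):
        best_idx, best_err = None, np.inf
        for j in range(num_weak):
            if j in selected:
                continue
            # Tentatively add weak classifier j and score the majority vote;
            # ties (sign == 0) count as errors here for simplicity.
            trial_votes = vote_sum + weak_outputs[j]
            err = np.mean(np.sign(trial_votes) != labels)
            if err < best_err:
                best_idx, best_err = j, err
        selected.append(best_idx)
        vote_sum += weak_outputs[best_idx]
    return selected
```

Because the weak-classifier outputs are tabulated once, the inner loop costs only an add and a comparison per sample, which is the source of the large training-time reduction the abstract reports.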
