Occlusion Handling via Random Subspace Classifiers for Human Detection

Javier Marin, David Vazquez, Antonio M. Lopez, Jaume Amores, Ludmila I. Kuncheva

Scientific output: Contribution to journal › Article › Research › Peer-reviewed

45 citations (Scopus)

Abstract

This paper describes a general method to address partial occlusions for human detection in still images. The random subspace method (RSM) is chosen for building a classifier ensemble robust against partial occlusions. The component classifiers are chosen on the basis of their individual and combined performance. The main contribution of this work lies in our approach's capability to improve the detection rate when partial occlusions are present without compromising the detection performance on non-occluded data. In contrast to many recent approaches, we propose a method that does not require manually labeling body parts, defining semantic spatial components, or using additional data from motion or stereo. Moreover, the method can be easily extended to other object classes. The experiments are performed on three large datasets: the INRIA person dataset, the Daimler Multicue dataset, and a new challenging dataset, called PobleSec, in which a considerable number of targets are partially occluded. The different approaches are evaluated at the classification and detection levels for both partially occluded and non-occluded data. The experimental results show that our detector outperforms state-of-the-art approaches in the presence of partial occlusions, while offering performance and reliability similar to those of the holistic approach on non-occluded data. The datasets used in our experiments have been made publicly available for benchmarking purposes.
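The random subspace method the abstract refers to can be illustrated with a minimal, generic sketch (this is not the paper's actual pedestrian detector, which operates on image descriptors; the least-squares base learner, subspace size, and synthetic data below are assumptions for illustration): each component classifier is trained on a random subset of the feature dimensions, and the ensemble decision is a majority vote over the components.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_linear(X, y):
    """Least-squares linear classifier; y uses labels in {-1, +1}."""
    X1 = np.hstack([X, np.ones((len(X), 1))])  # append bias column
    return np.linalg.pinv(X1) @ y

def predict_linear(w, X):
    X1 = np.hstack([X, np.ones((len(X), 1))])
    return np.sign(X1 @ w)

def rsm_fit(X, y, n_classifiers=11, subspace_dim=2):
    """Train each component on a random subset of feature dimensions."""
    ensemble = []
    for _ in range(n_classifiers):
        feats = rng.choice(X.shape[1], size=subspace_dim, replace=False)
        ensemble.append((feats, fit_linear(X[:, feats], y)))
    return ensemble

def rsm_predict(ensemble, X):
    """Majority vote over the component classifiers."""
    votes = np.array([predict_linear(w, X[:, feats]) for feats, w in ensemble])
    return np.sign(votes.sum(axis=0))

# Synthetic two-class data in 6 dimensions (illustrative only).
n = 200
X = np.vstack([rng.normal(1.0, 0.5, (n, 6)), rng.normal(-1.0, 0.5, (n, 6))])
y = np.concatenate([np.ones(n), -np.ones(n)])

ensemble = rsm_fit(X, y)
acc = np.mean(rsm_predict(ensemble, X) == y)
```

The intuition for occlusion robustness is that when a subspace corresponds to a local image region, an occluder corrupts only the components that look at that region, and the majority vote over the remaining components can still be correct; the paper additionally selects components by their individual and combined performance.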
Original language: English
Pages (from-to): 342-354
Number of pages: 13
Journal: IEEE Transactions on Cybernetics
Volume: 44
Issue: 3
DOIs
Publication status: Published - March 2014

