Boosted discriminant projections for nearest neighbor classification

David Masip, Jordi Vitrià

Research output: Contribution to journal › Article › peer-review

18 Citations (Scopus)


In this paper we introduce a new embedding technique to find the linear projection that best projects labeled data samples into a new space where the performance of a Nearest Neighbor classifier is maximized. We consider a large set of one-dimensional projections and combine them into a projection matrix, which is not restricted to be orthogonal. The embedding is defined as a classifier selection task that makes use of the AdaBoost algorithm to find an optimal set of discriminant projections. The main advantage of the algorithm is that the final projection matrix does not make any global assumption on the data distribution, and the projection matrix is created by minimizing the classification error on the training data set. Also, the resulting features can be ranked according to a set of coefficients computed during the algorithm. The performance of our embedding is tested on two different pattern recognition tasks, a gender recognition problem and the classification of handwritten digits. © 2005 Pattern Recognition Society. Published by Elsevier Ltd. All rights reserved.
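The abstract's core idea — using AdaBoost to select a set of discriminant one-dimensional projections, stacking them into a (possibly non-orthogonal) projection matrix, and classifying with a nearest-neighbor rule in the projected space — can be illustrated with a minimal sketch. This is not the paper's exact algorithm: the candidate pool here is random unit directions, the weak learner is a simple midpoint-of-means stump, and all function names are hypothetical; the paper's own candidate set and weak classifiers may differ.

```python
# Hypothetical sketch of boosted discriminant projections (binary case).
# Assumptions not taken from the paper: random unit vectors as the candidate
# projection pool, and a midpoint-of-means stump as the weak learner.
import numpy as np

def stump(z, y, w):
    """Weighted decision stump on 1-D projected values z (labels in {-1,+1})."""
    m_pos = np.average(z[y == 1], weights=w[y == 1])
    m_neg = np.average(z[y == -1], weights=w[y == -1])
    thr = 0.5 * (m_pos + m_neg)                 # threshold between class means
    sign = 1.0 if m_pos >= m_neg else -1.0      # polarity of the stump
    pred = np.where(sign * (z - thr) >= 0, 1, -1)
    return pred, w[pred != y].sum()             # predictions, weighted error

def boosted_projections(X, y, n_rounds=5, n_candidates=100, seed=0):
    """AdaBoost-style selection of n_rounds one-dimensional projections."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.full(n, 1.0 / n)                     # AdaBoost sample weights
    W = []                                      # selected directions
    for _ in range(n_rounds):
        best = None
        for _ in range(n_candidates):
            v = rng.normal(size=d)
            v /= np.linalg.norm(v)              # random unit direction
            pred, err = stump(X @ v, y, w)
            if best is None or err < best[2]:
                best = (v, pred, err)
        v, pred, err = best
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)   # round coefficient: a natural
        w *= np.exp(-alpha * y * pred)          # score for ranking the features
        w /= w.sum()                            # reweight toward hard samples
        W.append(v)
    return np.array(W)                          # rows = projection directions

def nn_predict(W, X_train, y_train, X_test):
    """1-NN classification in the space defined by the projection matrix W."""
    P_train, P_test = X_train @ W.T, X_test @ W.T
    d2 = ((P_test[:, None, :] - P_train[None, :, :]) ** 2).sum(-1)
    return y_train[d2.argmin(axis=1)]
```

Because each round reweights the samples the previous projections misclassified, later directions focus on the hard cases, and the round coefficients give the feature ranking mentioned in the abstract.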
Original language: English
Pages (from-to): 164-170
Journal: Pattern Recognition
Issue number: 2
Publication status: Published - 1 Feb 2006


  • Boosting
  • Classifier selection
  • Dimensionality reduction
  • Feature extraction
  • Linear discriminant analysis
  • Prototype selection


