Boosting the distance estimation: Application to the K-Nearest Neighbor classifier

J. Amores, N. Sebe, P. Radeva

Research output: Contribution to journal › Article › peer-review

39 Citations (Scopus)


In this work we introduce a new distance estimation technique based on boosting and apply it to the K-Nearest Neighbor classifier (K-NN). Instead of applying AdaBoost to a typical classification problem, we use it to learn a distance function, and the resulting distance is then used in K-NN. The proposed method (Boosted Distance with Nearest Neighbor) outperforms the AdaBoost classifier when the training set is small. It also outperforms the K-NN classifier used with several different distances, as well as distances obtained with other estimation methods such as Relevant Component Analysis (RCA) [Duda, R.O., Hart, P.E., Stork, D.G., 2001. Pattern Classification, John Wiley and Sons Inc.]. Furthermore, our distance estimation performs dimension reduction and achieves much higher classification accuracy than classical techniques such as PCA, LDA, and NDA. The method has been thoroughly tested on 13 standard databases from the UCI repository, a standard gender recognition database, and the MNIST database. © 2005 Elsevier B.V. All rights reserved.
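The general recipe the abstract describes can be sketched as follows: cast distance learning as a binary problem over pairs of samples (same class vs. different class), run AdaBoost on the feature-wise differences of each pair, and use the boosted score as a similarity inside K-NN. The sketch below is a minimal illustration of that idea, not the paper's exact algorithm; all function names, the stump weak learner, and the toy Gaussian data are assumptions for demonstration.

```python
# Illustrative sketch of a "boosted distance" for K-NN (NOT the paper's exact
# method): AdaBoost over decision stumps learns, from |x_i - x_j|, whether two
# points share a class; the boosted score serves as a similarity in K-NN.
import numpy as np

rng = np.random.default_rng(0)

def make_pairs(X, y, n_pairs=200):
    """Pair dataset: features |x_i - x_j|, label +1 if same class else -1."""
    i = rng.integers(0, len(X), n_pairs)
    j = rng.integers(0, len(X), n_pairs)
    return np.abs(X[i] - X[j]), np.where(y[i] == y[j], 1, -1)

def train_stump(F, L, w):
    """Best threshold stump (error, feature, threshold, sign) under weights w."""
    best = (np.inf, 0, 0.0, 1)
    for f in range(F.shape[1]):
        for t in np.quantile(F[:, f], np.linspace(0.1, 0.9, 9)):
            for s in (1, -1):
                pred = np.where(F[:, f] < t, s, -s)
                err = w[pred != L].sum()
                if err < best[0]:
                    best = (err, f, t, s)
    return best

def adaboost(F, L, rounds=20):
    """Standard discrete AdaBoost over the stumps above."""
    w = np.full(len(L), 1.0 / len(L))
    model = []
    for _ in range(rounds):
        err, f, t, s = train_stump(F, L, w)
        err = min(max(err, 1e-10), 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        pred = np.where(F[:, f] < t, s, -s)
        w *= np.exp(-alpha * L * pred)   # re-weight misclassified pairs
        w /= w.sum()
        model.append((alpha, f, t, s))
    return model

def similarity(model, a, b):
    """Boosted 'same-class' score for a pair; higher = more similar."""
    d = np.abs(a - b)
    return sum(alpha * (s if d[f] < t else -s) for alpha, f, t, s in model)

def knn_predict(model, Xtr, ytr, x, k=3):
    sims = np.array([similarity(model, x, xi) for xi in Xtr])
    top = np.argsort(-sims)[:k]          # k most-similar training points
    vals, counts = np.unique(ytr[top], return_counts=True)
    return vals[np.argmax(counts)]

# Toy two-class Gaussian data (illustrative only)
X = np.vstack([rng.normal(0, 1, (40, 2)), rng.normal(3, 1, (40, 2))])
y = np.array([0] * 40 + [1] * 40)
model = adaboost(*make_pairs(X, y))
pred = knn_predict(model, X, y, np.array([2.9, 3.1]))  # near class-1 cluster
```

Note the reversal of roles compared with ordinary AdaBoost: the boosted classifier never labels a sample directly; it only scores pairs, and the final class decision is still made by the K-NN vote.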
Original language: English
Pages (from-to): 201-209
Journal: Pattern Recognition Letters
Issue number: 3
Publication status: Published - 1 Feb 2006


Keywords:
  • AdaBoost
  • Dimension reduction
  • Distance estimation


