Fast Kernel Generalized Discriminative Common Vectors for Feature Extraction

Katerine Diaz-Chito, Jesús Martínez del Rincón, Aura Hernández-Sabaté, Marçal Rusiñol, Francesc J. Ferri

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)


© 2017, Springer Science+Business Media, LLC. This paper presents a supervised subspace learning method called Kernel Generalized Discriminative Common Vectors (KGDCV), a novel extension of the well-known Discriminative Common Vectors method with kernels. Our method combines the advantages of kernel methods, which model complex data and solve nonlinear problems with moderate computational complexity, with the better generalization properties of generalized approaches for high-dimensional data. This attractive combination makes KGDCV especially suited for feature extraction and classification in computer vision, image processing and pattern recognition applications. Two different approaches to this generalization are proposed: a first one based on the Kernel Trick and a second one based on the Nonlinear Projection Trick (NPT) for even higher efficiency. Both methodologies have been validated on four different image datasets containing faces, objects and handwritten digits, and compared against well-known nonlinear state-of-the-art methods. Results show better discriminant properties than other generalized approaches, both linear and kernel. In addition, the KGDCV-NPT approach presents a considerable computational gain, without compromising the accuracy of the model.
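As background to the efficiency claim in the abstract, the Nonlinear Projection Trick replaces the implicit kernel feature map with an explicit finite-dimensional embedding obtained from the eigendecomposition of the centered kernel matrix, so that any linear subspace method (such as a linear DCV variant) run in that embedded space reproduces the corresponding kernel method. The sketch below is a minimal, illustrative NumPy rendering of that general idea, not the paper's KGDCV algorithm; the function names `rbf_kernel` and `npt_embed` are our own.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian RBF kernel matrix between the rows of A and the rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def npt_embed(X, gamma=1.0, tol=1e-10):
    """Explicit embedding via the Nonlinear Projection Trick (illustrative).

    Returns Y such that Y @ Y.T equals the centered kernel matrix, so a
    *linear* method applied to the rows of Y sees the same geometry as the
    kernel method applied to X.
    """
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    H = np.eye(n) - np.full((n, n), 1.0 / n)   # centering matrix
    Kc = H @ K @ H                             # centered kernel matrix
    w, U = np.linalg.eigh(Kc)                  # eigenvalues in ascending order
    keep = w > tol                             # drop null/near-zero directions
    Y = U[:, keep] * np.sqrt(w[keep])          # n x r explicit feature matrix
    return Y, Kc

# Toy usage: 30 points from 3 clusters in 2-D
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(c, 0.1, size=(10, 2)) for c in (0.0, 1.0, 2.0)])
Y, Kc = npt_embed(X, gamma=0.5)
assert np.allclose(Y @ Y.T, Kc)  # embedding reproduces the kernel geometry
```

Because `Y` has at most `n` columns, subsequent linear algebra (scatter matrices, projections) works on an `n x r` matrix rather than through repeated kernel evaluations, which is the source of the computational gain the abstract attributes to the NPT variant.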
Original language: English
Pages (from-to): 512-524
Journal: Journal of Mathematical Imaging and Vision
Issue number: 4
Publication status: Published - 1 May 2018


Keywords:
  • Computational efficiency
  • Kernel Discriminative Common Vectors
  • Kernel Trick
  • Nonlinear Projection Trick
  • Nonlinear feature extraction


