On the selection and classification of independent features

Marco Bressan, Jordi Vitrià

    Research output: Contribution to journal › Article › Research › peer-review

    37 citations (Scopus)

    Abstract

    This paper addresses the problems of feature selection and classification when classes are modeled by statistically independent features. We show that, under the assumption of class-conditional independence, the class separability measure of divergence simplifies greatly, becoming a sum of unidimensional divergences and thus providing a feature selection criterion that requires no exhaustive search. Since the hypothesis of independence is infrequently met in practice, we also provide a framework, based on class-conditional Independent Component Analyzers, in which this assumption holds on stronger grounds. Divergence and the Bayes decision scheme are adapted to this class-conditional representation. An algorithm that integrates the proposed representation, feature selection technique, and classifier is presented. Experiments on artificial, benchmark, and real-world data illustrate our technique and evaluate its performance.
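    The key simplification in the abstract can be sketched in code. The snippet below is an illustrative sketch, not the paper's exact algorithm: assuming Gaussian one-dimensional marginals (an assumption made here purely for illustration), the symmetric Kullback-Leibler divergence is computed per feature and the per-feature scores are used to rank features individually, so no exhaustive subset search is needed.

    ```python
    # Sketch: under class-conditional independence, divergence decomposes
    # into a sum of unidimensional divergences, so features can be ranked
    # one at a time. Gaussian marginals are an illustrative assumption.
    import math

    def gaussian_divergence_1d(m1, s1, m2, s2):
        """Symmetric KL divergence between two 1-D Gaussians N(m1,s1^2), N(m2,s2^2)."""
        d2 = (m1 - m2) ** 2
        return (s1**2 + d2) / (2 * s2**2) + (s2**2 + d2) / (2 * s1**2) - 1.0

    def rank_features(class1, class2):
        """Rank features by unidimensional divergence, largest (most separating) first.

        class1, class2: lists of samples; each sample is a list of feature values.
        """
        def stats(col):
            m = sum(col) / len(col)
            s = math.sqrt(sum((x - m) ** 2 for x in col) / len(col)) or 1e-9
            return m, s

        scores = []
        for j in range(len(class1[0])):
            m1, s1 = stats([x[j] for x in class1])
            m2, s2 = stats([x[j] for x in class2])
            scores.append((gaussian_divergence_1d(m1, s1, m2, s2), j))
        return sorted(scores, reverse=True)

    # Toy data: feature 0 separates the classes; feature 1 is pure noise.
    c1 = [[0.0, 0.3], [0.2, -0.1], [-0.1, 0.2], [0.1, 0.0]]
    c2 = [[5.0, 0.1], [5.2, 0.2], [4.9, -0.2], [5.1, 0.1]]
    ranking = rank_features(c1, c2)
    print([j for _, j in ranking])  # → [0, 1]
    ```

    Because each feature is scored independently, selecting the best k features is a simple sort rather than a search over all feature subsets, which is what makes the criterion tractable.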
    Original language: English
    Pages (from-to): 1312-1317
    Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
    Volume: 25
    Issue number: 10
    DOIs
    Publication status: Published - 1 Oct 2003

    Keywords

    • Divergence
    • Feature selection
    • Independent component analysis
    • Naive Bayes
