Abstract
This paper focuses on the problems of feature selection and classification when classes are modeled by statistically independent features. We show that, under the assumption of class-conditional independence, the divergence measure of class separability simplifies greatly, becoming a sum of unidimensional divergences and thus providing a feature selection criterion that requires no exhaustive search. Since the hypothesis of independence is rarely met in practice, we also provide a framework based on class-conditional Independent Component Analyzers, under which this assumption holds on stronger grounds. Divergence and the Bayes decision scheme are adapted to this class-conditional representation. An algorithm that integrates the proposed representation, feature selection technique, and classifier is presented. Experiments on artificial, benchmark, and real-world data illustrate our technique and evaluate its performance.
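To make the simplification concrete, the following shows the standard form of the divergence between two class-conditional densities and its decomposition under independence; the notation ($J$, $d$, $x_i$) is ours, since the abstract does not fix symbols:

$$
J = \int \bigl(p_1(\mathbf{x}) - p_2(\mathbf{x})\bigr)\,\ln\frac{p_1(\mathbf{x})}{p_2(\mathbf{x})}\,d\mathbf{x},
\qquad
p_k(\mathbf{x}) = \prod_{i=1}^{d} p_k(x_i)
\;\Longrightarrow\;
J = \sum_{i=1}^{d} J_i,
$$

where $J_i$ is the divergence computed on feature $x_i$ alone. Because $J$ is additive, features can be ranked by their individual $J_i$ and the top ones kept directly, with no search over subsets. Below is a minimal sketch of such a ranking-based selector, assuming histogram estimates of the unidimensional densities; the function names and the estimator choice are illustrative, not the paper's:

```python
import numpy as np

def univariate_divergence(x1, x2, bins=32):
    """Histogram estimate of the divergence J_i between two
    one-dimensional class-conditional samples."""
    lo = min(x1.min(), x2.min())
    hi = max(x1.max(), x2.max())
    p1, _ = np.histogram(x1, bins=bins, range=(lo, hi), density=True)
    p2, _ = np.histogram(x2, bins=bins, range=(lo, hi), density=True)
    eps = 1e-12                      # guard empty bins against log(0)
    p1, p2 = p1 + eps, p2 + eps
    width = (hi - lo) / bins
    # (p1 - p2) * ln(p1 / p2), integrated bin by bin
    return float(np.sum((p1 - p2) * np.log(p1 / p2)) * width)

def select_features(X1, X2, k):
    """Keep the k features with the largest unidimensional
    divergence; additivity of J makes subset search unnecessary."""
    scores = [univariate_divergence(X1[:, i], X2[:, i])
              for i in range(X1.shape[1])]
    return np.argsort(scores)[::-1][:k]
```

In the paper's full scheme the data would first be projected onto class-conditional ICA bases, so that the independence assumption is better justified; the sketch above covers only the selection step.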
| Original language | English |
| --- | --- |
| Pages (from-to) | 1312-1317 |
| Journal | IEEE Transactions on Pattern Analysis and Machine Intelligence |
| Volume | 25 |
| Issue number | 10 |
| DOIs | |
| Publication status | Published - 1 Oct 2003 |
Keywords
- Divergence
- Feature selection
- Independent component analysis
- Naive Bayes