Feature Augmentation via Nonparametrics and Selection (FANS) in High Dimensional Classification
We propose a high dimensional classification method that involves nonparametric feature augmentation. Knowing that marginal density ratios are the most powerful univariate classifiers, we use the ratio estimates to transform the original feature measurements. Subsequently, penalized logistic regression is invoked, taking as input the newly transformed or augmented features. This procedure trains models equipped with local complexity and global simplicity, thereby avoiding the curse of dimensionality while creating a flexible nonlinear decision boundary. The resulting method is called Feature Augmentation via Nonparametrics and Selection (FANS). We motivate FANS by generalizing the Naive Bayes model, writing the log ratio of joint densities as a linear combination of the corresponding marginal log ratios. It is related to generalized additive models, but offers better interpretability and computability. Risk bounds are developed for FANS. In numerical studies, FANS is compared with competing methods to provide guidance on its best application domain. Real data analysis demonstrates that FANS performs very competitively on benchmark email spam and gene expression data sets. Moreover, FANS is implemented by an extremely fast algorithm through parallel computing.
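The two-stage procedure sketched in the abstract (estimate marginal class-conditional densities, replace each feature with its estimated log density ratio, then fit a sparse logistic regression on the augmented features) can be illustrated as follows. This is only a minimal sketch of the idea, not the authors' implementation: it assumes scikit-learn's KernelDensity and L1-penalized LogisticRegression, and the function name fans_fit, the bandwidth, and the single sample split are illustrative choices.

import numpy as np
from sklearn.neighbors import KernelDensity
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def fans_fit(X, y, bandwidth=0.5, C=1.0, random_state=0):
    """Two-stage FANS-style fit (illustrative sketch): estimate marginal log
    density ratios on one half of the data, then fit L1-penalized logistic
    regression on the transformed features using the other half."""
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    # Sample splitting: one half for density estimation, one half for regression.
    X_den, X_reg, y_den, y_reg = train_test_split(
        X, y, test_size=0.5, stratify=y, random_state=random_state)

    # One pair of kernel density estimates (class 1, class 0) per feature.
    kdes = []
    for j in range(X.shape[1]):
        kde1 = KernelDensity(bandwidth=bandwidth).fit(X_den[y_den == 1, j:j + 1])
        kde0 = KernelDensity(bandwidth=bandwidth).fit(X_den[y_den == 0, j:j + 1])
        kdes.append((kde1, kde0))

    def transform(X_new):
        # Augmented feature z_j = log f_j(x_j) - log g_j(x_j),
        # the estimated marginal log density ratio for feature j.
        X_new = np.asarray(X_new, dtype=float)
        Z = np.empty_like(X_new)
        for j, (kde1, kde0) in enumerate(kdes):
            Z[:, j] = (kde1.score_samples(X_new[:, j:j + 1])
                       - kde0.score_samples(X_new[:, j:j + 1]))
        return Z

    # Penalized logistic regression on the augmented features.
    clf = LogisticRegression(penalty="l1", solver="liblinear", C=C)
    clf.fit(transform(X_reg), y_reg)
    return clf, transform

Usage would then be: clf, transform = fans_fit(X_train, y_train), followed by clf.predict(transform(X_test)). The full procedure in the paper additionally averages over several random data splits, which is where the parallel implementation mentioned above comes in; that repetition is omitted here for brevity.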
Date:
28 May 2015, 14:15 (Thursday, 5th week, Trinity 2015)
Venue:
1 South Parks Road, OX1 3TG
Venue Details:
Lecture Theatre
Speaker:
Yang Feng (Department of Statistics, Columbia University)
Organising department:
Department of Statistics
Part of:
Statistics, Applied Probability & Operational Research Seminars
Booking required?:
Not required
Audience:
Members of the University only
Editor:
Anne Bowtell