David Masip, Agata Lapedriza, & Jordi Vitria. (2009). Boosted Online Learning for Face Recognition. TSMCB - IEEE Transactions on Systems, Man and Cybernetics part B, 39(2), 530–538.
Abstract: Face recognition applications commonly suffer from three main drawbacks: a reduced training set, information lying in high-dimensional subspaces, and the need to incorporate new people to recognize. In the recent literature, the extension of a face classifier to include new people in the model has been addressed using online feature extraction techniques, the most successful of which are extensions of principal component analysis or linear discriminant analysis. In the current paper, a new online boosting algorithm is introduced: a face recognition method that extends a boosting-based classifier by adding new classes while avoiding the need to retrain the classifier each time a new person joins the system. The classifier is learned using the multitask learning principle, where multiple verification tasks are trained together sharing the same feature space. New classes are added by taking advantage of the previously learned structure, so that adding them is not computationally demanding. The proposal has been experimentally validated on two different facial data sets by comparing our approach with current state-of-the-art techniques. The results show that the proposed online boosting algorithm fares better in terms of final accuracy. In addition, the global performance does not decrease drastically even when the number of classes of the base problem is multiplied by eight.
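The core idea of the abstract above, reusing a shared feature space learned on the base classes so that enrolling a new person requires no retraining, can be illustrated with a minimal sketch. This is not the paper's boosting method: the frozen random projection, the mean-prototype rule, and the class names are all illustrative assumptions standing in for the learned boosted features.

```python
import numpy as np

class PrototypeGallery:
    """Toy illustration: a fixed, pre-learned feature map is reused,
    so adding a new identity needs no retraining of the map itself."""

    def __init__(self, feature_map):
        self.feature_map = feature_map   # learned once on base classes, then frozen
        self.prototypes = {}             # class label -> mean feature vector

    def add_class(self, label, samples):
        # Enrolling a new person only averages its projected samples;
        # the shared feature space is left untouched.
        feats = np.array([self.feature_map(x) for x in samples])
        self.prototypes[label] = feats.mean(axis=0)

    def predict(self, x):
        # Nearest-prototype decision in the shared feature space.
        f = self.feature_map(x)
        return min(self.prototypes,
                   key=lambda c: np.linalg.norm(f - self.prototypes[c]))

# A fixed random projection stands in for the boosted feature space.
rng = np.random.default_rng(0)
W = rng.normal(size=(5, 2))
gallery = PrototypeGallery(lambda x: x @ W)

gallery.add_class("alice", rng.normal(0.0, 0.1, size=(10, 5)))
gallery.add_class("bob", rng.normal(3.0, 0.1, size=(10, 5)))
print(gallery.predict(np.full(5, 3.0)))  # a sample near "bob"
```

The point of the sketch is the cost profile: adding a class touches only that class's samples, while the feature map, the expensive part, stays fixed.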
David Masip, & Jordi Vitria. (2006). Boosted discriminant projections for nearest neighbor classification. Pattern Recognition, 39(2), 164–170.
Jordi Vitria, M. Bressan, & Petia Radeva. (2007). Bayesian classification of cork stoppers using class-conditional independent component analysis. IEEE Transactions on Systems, Man and Cybernetics (Part C), 37(1), 32–38 (ISI 0.482).
Santiago Segui, Laura Igual, & Jordi Vitria. (2013). Bagged One Class Classifiers in the Presence of Outliers. IJPRAI - International Journal of Pattern Recognition and Artificial Intelligence, 27(5), 1350014–1350035.
Abstract: The problem of training classifiers only with target data arises in many applications where non-target data are too costly, difficult to obtain, or not available at all. Several one-class classification methods have been presented to solve this problem, but most of the methods are highly sensitive to the presence of outliers in the target class. Ensemble methods have therefore been proposed as a powerful way to improve the classification performance of binary/multi-class learning algorithms by introducing diversity into classifiers.
However, their application to one-class classification has been rather limited. In this paper, we present a new ensemble method based on a non-parametric weighted bagging strategy for one-class classification, to improve accuracy in the presence of outliers. While the standard bagging strategy assumes a uniform data distribution, the method we propose here estimates a probability density based on a forest structure of the data. This assumption allows the data distribution to be estimated from the computation of simple univariate and bivariate kernel densities. Experiments using original and noisy versions of 20 different datasets show that bagging ensemble methods applied to different one-class classifiers outperform base one-class classification methods. Moreover, we show that, in noisy versions of the datasets, the non-parametric weighted bagging strategy we propose outperforms the classical bagging strategy in a statistically significant way.
Keywords: One-class Classifier; Ensemble Methods; Bagging; Outliers
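The contrast the abstract draws, uniform bootstrap sampling versus density-weighted sampling that resamples likely outliers less often, can be sketched in a few lines. This is only an illustration under stated assumptions, not the paper's method: a product-of-Gaussian-kernels density stands in for the forest-based estimator, and the base one-class model is a simple mean-plus-radius threshold rather than the classifiers used in the experiments.

```python
import numpy as np

def kde_weights(X, bandwidth=0.5):
    """Per-sample density estimates (Gaussian kernels) used as bootstrap
    sampling weights, so low-density points -- likely outliers -- are
    resampled less often. A stand-in for the paper's density model."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    dens = np.exp(-d2 / (2 * bandwidth ** 2)).sum(1)
    return dens / dens.sum()

def fit_weighted_bagging(X, n_estimators=25, rng=None):
    """Each base one-class model is a mean + radius threshold fitted
    on a density-weighted bootstrap resample of the target data."""
    rng = np.random.default_rng(rng)
    p = kde_weights(X)
    models = []
    for _ in range(n_estimators):
        Xb = X[rng.choice(len(X), size=len(X), p=p)]
        mu = Xb.mean(0)
        r = np.quantile(np.linalg.norm(Xb - mu, axis=1), 0.95)
        models.append((mu, r))
    return models

def predict(models, x):
    """Majority vote: accept x as target if most base models accept it."""
    votes = sum(np.linalg.norm(x - mu) <= r for mu, r in models)
    return votes > len(models) / 2

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (95, 2)),    # target class
               rng.normal(8, 0.5, (5, 2))])  # a few contaminating outliers
ens = fit_weighted_bagging(X, rng=2)
print(predict(ens, np.zeros(2)))      # point near the target mean
print(predict(ens, np.full(2, 8.0)))  # point in the outlier region
```

Swapping `p=p` for uniform sampling in `rng.choice` recovers classical bagging; the weighted variant is the one the abstract argues degrades more gracefully as contamination grows.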