%0 Journal Article
%T Bagged One Class Classifiers in the Presence of Outliers
%A Santiago Segui
%A Laura Igual
%A Jordi Vitria
%J International Journal of Pattern Recognition and Artificial Intelligence
%D 2013
%V 27
%N 5
%F Santiago Segui2013
%O OR; 600.046;MV
%O exported from refbase (http://refbase.cvc.uab.es/show.php?record=2256), last updated on Thu, 21 Jan 2016 11:00:53 +0100
%X The problem of training classifiers only with target data arises in many applications where non-target data are too costly, difficult to obtain, or not available at all. Several one-class classification methods have been presented to solve this problem, but most of these methods are highly sensitive to the presence of outliers in the target class. Ensemble methods have therefore been proposed as a powerful way to improve the classification performance of binary/multi-class learning algorithms by introducing diversity into the classifiers. However, their application to one-class classification has been rather limited. In this paper, we present a new ensemble method based on a non-parametric weighted bagging strategy for one-class classification, to improve accuracy in the presence of outliers. While the standard bagging strategy assumes a uniform data distribution, the method we propose here estimates a probability density based on a forest structure of the data. This assumption allows the estimation of the data distribution from the computation of simple univariate and bivariate kernel densities. Experiments using original and noisy versions of 20 different datasets show that bagging ensemble methods applied to different one-class classifiers outperform the base one-class classification methods. Moreover, we show that, on the noisy versions of the datasets, the non-parametric weighted bagging strategy we propose outperforms the classical bagging strategy in a statistically significant way.
%K One-class Classifier
%K Ensemble Methods
%K Bagging and Outliers
%U http://refbase.cvc.uab.es/files/SIV2013.pdf
%U http://dx.doi.org/10.1142/S0218001413500146
%P 1350014-1350035