PT Journal
AU Oriol Ramos Terrades
AU Ernest Valveny
AU Salvatore Tabbone
TI Optimal Classifier Fusion in a Non-Bayesian Probabilistic Framework
SO IEEE Transactions on Pattern Analysis and Machine Intelligence
JI TPAMI
PY 2009
VL 31
IS 9
BP 1630
EP 1644
DI 10.1109/TPAMI.2008.224
AB The combination of the outputs of several classifiers is one strategy used to improve classification rates in general-purpose classification systems. Some of the most common approaches can be explained using Bayes' formula. In this paper, we tackle the problem of combining classifiers using a non-Bayesian probabilistic framework. This approach permits us to derive two linear combination rules that minimize misclassification rates under some constraints on the distribution of classifiers. To show the validity of this approach, we have compared it with other popular combination rules, both theoretically on a synthetic data set and experimentally on two standard databases: the MNIST handwritten digit database and the GREC symbol database. Results on the synthetic data set confirm the validity of the theoretical analysis, and results on real data show that the proposed methods outperform other common combination schemes.
ER