Is Combining Classifiers with Stacking Better than Selecting the Best One?

2004 | Saso Džeroski, Bernard Ženko
This paper evaluates several state-of-the-art stacking methods for combining heterogeneous classifiers and compares them to selecting the best classifier via cross-validation (SelectBest). Existing stacking methods perform at best comparably to SelectBest, with stacking on class-probability distributions using multi-response linear regression (MLR) as the meta-level learner performing best among them. The authors propose two extensions: one using an extended set of meta-level features, and one using multi-response model trees instead of MLR for meta-level learning. The latter, stacking with multi-response model trees (SMM5), outperforms both the existing stacking methods and SelectBest. The results indicate that stacking with MLR and SMM5 is effective, while the other stacking methods do not consistently outperform selecting the best classifier. The paper highlights the importance of the choice of meta-level features and meta-level learning algorithm, and concludes that stacking with multi-response model trees is a good choice for meta-level learning.
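The comparison the paper makes can be sketched in a few lines. Below is a minimal illustration (not the authors' code) using scikit-learn: heterogeneous base classifiers are stacked on their predicted class probabilities, and the result is compared against simply picking the base classifier with the best cross-validated accuracy. Logistic regression stands in here for the paper's multi-response linear regression / model-tree meta-learners, and the dataset and classifier choices are arbitrary examples.

```python
# Sketch of "stacking vs. SelectBest", assuming scikit-learn is available.
# The meta-learner and base learners are stand-ins, not the paper's exact setup.
from sklearn.datasets import load_iris
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
base = [("nb", GaussianNB()),
        ("knn", KNeighborsClassifier()),
        ("tree", DecisionTreeClassifier(random_state=0))]

# SelectBest: choose the single base classifier with the highest
# cross-validated accuracy.
best_name, best_score = max(
    ((name, cross_val_score(clf, X, y, cv=10).mean()) for name, clf in base),
    key=lambda t: t[1])

# Stacking: the meta-level features are the base models' predicted class
# probabilities (stack_method="predict_proba"), generated via internal CV.
stack = StackingClassifier(estimators=base,
                           final_estimator=LogisticRegression(max_iter=1000),
                           stack_method="predict_proba", cv=10)
stack_score = cross_val_score(stack, X, y, cv=10).mean()
print(f"SelectBest ({best_name}): {best_score:.3f}  Stacking: {stack_score:.3f}")
```

On a toy dataset like this the two scores are often close, which mirrors the paper's finding that only the stronger stacking variants (probability-based MLR, and especially SMM5) gain over selecting the best classifier.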