Combining bagging, boosting, rotation forest and random subspace methods
- 21 December 2010
- journal article
- Published by Springer Science and Business Media LLC in Artificial Intelligence Review
- Vol. 35 (3), 223-240
- https://doi.org/10.1007/s10462-010-9192-8
Abstract
No abstract available.
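The title combines four ensemble techniques. As a hedged illustration only (not the authors' actual algorithm, whose details are not given here), two of them, bagging and the random subspace method, can be combined by training each ensemble member on a bootstrap sample of the rows restricted to a random subset of the columns, then majority-voting. The 1-nearest-neighbour base learner below is an arbitrary choice to keep the sketch self-contained:

```python
# Sketch: bagging + random subspace ensemble with majority voting.
# Assumptions: dense numeric features, a toy 1-NN base learner.
import random
from collections import Counter

def nn_predict(train_X, train_y, x):
    """1-NN: return the label of the closest training point (squared Euclidean)."""
    best = min(range(len(train_X)),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(train_X[i], x)))
    return train_y[best]

def fit_subspace_bagging(X, y, n_members=15, subspace_frac=0.5, seed=0):
    """Each member gets a bootstrap row sample (bagging) and a random
    feature subset (random subspace)."""
    rng = random.Random(seed)
    n, d = len(X), len(X[0])
    k = max(1, int(d * subspace_frac))
    members = []
    for _ in range(n_members):
        feats = rng.sample(range(d), k)             # random subspace
        idx = [rng.randrange(n) for _ in range(n)]  # bootstrap sample
        sub_X = [[X[i][f] for f in feats] for i in idx]
        sub_y = [y[i] for i in idx]
        members.append((feats, sub_X, sub_y))
    return members

def predict(members, x):
    """Majority vote over the members' predictions."""
    votes = Counter()
    for feats, sub_X, sub_y in members:
        votes[nn_predict(sub_X, sub_y, [x[f] for f in feats])] += 1
    return votes.most_common(1)[0][0]
```

Either ingredient can be disabled independently (use all rows for pure random subspace, all columns for pure bagging), which is the kind of factorisation that makes combining these methods straightforward.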
This publication has 28 references indexed in Scilit:
- A weighted subspace approach for improving bagging performance. Published by Institute of Electrical and Electronics Engineers (IEEE), 2008
- On bagging and nonlinear estimation. Journal of Statistical Planning and Inference, 2006
- Attribute bagging: improving accuracy of classifier ensembles by using random feature subsets. Pattern Recognition, 2002
- Analyzing bagging. The Annals of Statistics, 2002
- Random Forests. Machine Learning, 2001
- An Experimental Comparison of Three Methods for Constructing Ensembles of Decision Trees: Bagging, Boosting, and Randomization. Machine Learning, 2000
- An Empirical Comparison of Voting Classification Algorithms: Bagging, Boosting, and Variants. Machine Learning, 1999
- Memory-based morphological analysis. Published by Association for Computational Linguistics (ACL), 1999
- A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting. Journal of Computer and System Sciences, 1997
- On the Optimality of the Simple Bayesian Classifier under Zero-One Loss. Machine Learning, 1997