Boosting random subspace method
- 30 November 2008
- journal article
- Published by Elsevier BV in Neural Networks
- Vol. 21 (9), 1344-1362
- https://doi.org/10.1016/j.neunet.2007.12.046
Abstract
No abstract available.
This publication has 28 references indexed in Scilit:
- Is Combining Classifiers with Stacking Better than Selecting the Best One? Machine Learning, 2004
- Online Ensemble Learning: An Empirical Study. Machine Learning, 2003
- Random Forests. Machine Learning, 2001
- Ensemble Methods in Machine Learning. Lecture Notes in Computer Science, 2000
- An Experimental Comparison of Three Methods for Constructing Ensembles of Decision Trees: Bagging, Boosting, and Randomization. Machine Learning, 2000
- An Empirical Comparison of Voting Classification Algorithms: Bagging, Boosting, and Variants. Machine Learning, 1999
- Approximate Statistical Tests for Comparing Supervised Classification Learning Algorithms. Neural Computation, 1998
- Methods of Combining Multiple Classifiers with Different Features and Their Applications to Text-Independent Speaker Identification. International Journal of Pattern Recognition and Artificial Intelligence, 1997
- Bagging predictors. Machine Learning, 1996
- Stacked regressions. Machine Learning, 1996