Fast Generalized Cross-Validation Algorithm for Sparse Model Learning
- 1 January 2007
- journal article
- Published by MIT Press in Neural Computation
- Vol. 19 (1), 283-301
- https://doi.org/10.1162/neco.2007.19.1.283
Abstract
We propose a fast, incremental algorithm for designing linear regression models. The proposed algorithm generates a sparse model by optimizing multiple smoothing parameters via the generalized cross-validation approach. Its performance on synthetic and real-world data sets is compared with that of other incremental algorithms, such as Tipping and Faul's fast relevance vector machine, Chen et al.'s orthogonal least squares, and Orr's regularized forward selection. The results demonstrate that the proposed algorithm is competitive.
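For context, the generalized cross-validation criterion named in the abstract (Golub, Heath, and Wahba, 1979; see the references below) selects a ridge parameter λ by minimizing GCV(λ) = n‖(I − A(λ))y‖² / [tr(I − A(λ))]², where A(λ) is the influence (hat) matrix of the regularized fit. The sketch below illustrates this classic single-parameter criterion on synthetic data; it is a minimal illustration of GCV-based smoothing-parameter selection, not the paper's incremental multi-parameter algorithm, and the data and grid of candidate values are hypothetical.

```python
import numpy as np

def gcv_score(X, y, lam):
    """GCV score for ridge regression with a single smoothing
    parameter lam (Golub, Heath & Wahba, 1979)."""
    n, p = X.shape
    # Influence (hat) matrix A(lam) = X (X'X + lam I)^{-1} X'
    A = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)
    residual = y - A @ y
    # GCV(lam) = n * ||(I - A) y||^2 / tr(I - A)^2
    return n * (residual @ residual) / (n - np.trace(A)) ** 2

# Hypothetical synthetic data; select lam by grid search over candidates.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
y = X @ rng.standard_normal(10) + 0.1 * rng.standard_normal(100)
candidates = np.logspace(-4, 2, 50)
best_lam = min(candidates, key=lambda lam: gcv_score(X, y, lam))
print(f"GCV-selected ridge parameter: {best_lam:.4g}")
```

Because GCV approximates leave-one-out cross-validation without refitting the model n times, it can be evaluated cheaply, which is what makes criteria of this kind attractive for incremental model construction.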
References indexed in Scilit:
- Sparse On-Line Gaussian Processes. Neural Computation, 2002
- Predictive Approaches for Choosing Hyperparameters in Gaussian Processes. Neural Computation, 2001
- Sparse Bayesian Learning and the Relevance Vector Machine (10.1162/15324430152748236). Journal of Machine Learning Research, 2001
- Support-Vector Networks. Machine Learning, 1995
- Regularization in the Selection of Radial Basis Function Centers. Neural Computation, 1995
- Multivariate Adaptive Regression Splines. The Annals of Statistics, 1991
- Orthogonal Least Squares Learning Algorithm for Radial Basis Function Networks. IEEE Transactions on Neural Networks, 1991
- Generalized Cross-Validation as a Method for Choosing a Good Ridge Parameter. Technometrics, 1979
- Ridge Regression: Biased Estimation for Nonorthogonal Problems. Technometrics, 1970