SVM vs regularized least squares classification
- 1 January 2004
- conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE) in Proceedings of the 17th International Conference on Pattern Recognition, 2004. ICPR 2004.
- Vol. 1, pp. 176-179
- https://doi.org/10.1109/icpr.2004.1334050
Abstract
Support vector machines (SVMs) and regularized least squares (RLS) are two recent promising techniques for classification. SVMs implement the structural risk minimization principle and use the kernel trick to extend it to the nonlinear case. RLS, by contrast, minimizes a regularized functional directly in a reproducing kernel Hilbert space defined by a kernel. While both have a sound mathematical foundation, RLS is strikingly simple. SVMs, on the other hand, generally yield a sparse representation of solutions. In addition, the performance of SVMs has been well documented, but little can be said of RLS. This paper applies these two techniques to a collection of data sets and presents results demonstrating virtually identical performance by the two methods.
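To make the comparison concrete: the RLS classifier described in the abstract solves, for training data (x_i, y_i) with labels y_i in {-1, +1}, the standard regularized problem

```latex
\min_{f \in \mathcal{H}_K} \; \frac{1}{n} \sum_{i=1}^{n} \bigl(y_i - f(x_i)\bigr)^2 + \lambda \, \|f\|_{\mathcal{H}_K}^2
```

By the representer theorem, the minimizer has the form f(x) = sum_i c_i K(x, x_i), where the coefficients solve the linear system (K + lambda * n * I) c = y, and classification is by the sign of f(x). The sketch below is not the paper's experimental code; the dataset, kernel width gamma, and regularization lambda are illustrative assumptions. It contrasts a direct RLS solve with a kernel SVM:

```python
# Minimal sketch comparing kernel SVM and RLS classification on a toy
# dataset. Hyperparameters and data are illustrative assumptions, not
# the paper's experimental setup.
import numpy as np
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel

X, y = make_moons(n_samples=400, noise=0.2, random_state=0)
y_pm = 2 * y - 1  # RLS treats classification as regression on +/-1 labels
X_tr, X_te, y_tr, y_te = train_test_split(X, y_pm, random_state=0)

gamma, lam = 1.0, 1e-2  # assumed kernel width and regularization
K_tr = rbf_kernel(X_tr, X_tr, gamma=gamma)
n = K_tr.shape[0]

# RLS: one linear solve gives the dual coefficients, (K + lambda*n*I) c = y
c = np.linalg.solve(K_tr + lam * n * np.eye(n), y_tr)
rls_pred = np.sign(rbf_kernel(X_te, X_tr, gamma=gamma) @ c)

# Kernel SVM with the same RBF kernel
svm = SVC(kernel="rbf", gamma=gamma, C=1.0).fit(X_tr, y_tr)
svm_pred = svm.predict(X_te)

print("RLS accuracy:", np.mean(rls_pred == y_te))
print("SVM accuracy:", np.mean(svm_pred == y_te))
# SVM solutions are sparse; RLS uses every training point
print("SVM support vectors:", svm.n_support_.sum(), "of", n)
```

This illustrates the trade-off the abstract mentions: the SVM predictor depends only on its support vectors, while the RLS predictor uses all training points but is obtained from a single linear solve.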