Geometrical Properties of Nu Support Vector Machines with Different Norms
- 1 November 2005
- Research article
- Published by MIT Press in Neural Computation
- Vol. 17 (11), 2508-2529
- https://doi.org/10.1162/0899766054796897
Abstract
By employing the L1 or L∞ norm in maximizing margins, support vector machines (SVMs) lead to a linear programming problem, which carries a lower computational load than SVMs with the L2 norm. However, how the choice of norm affects the generalization ability of SVMs has not been clarified beyond numerical experiments. In this letter, the geometrical meaning of SVMs with the Lp norm is investigated, and the SVM solutions are shown to depend only weakly on p.
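To make the linear-programming connection concrete: minimizing the L1 norm of the weight vector under hard-margin constraints (which corresponds to maximizing the L∞ margin) can be written directly as a linear program. The sketch below is an illustrative formulation, not the paper's algorithm; the function name `l1_svm_lp`, the toy data, and the use of `scipy.optimize.linprog` are this example's own choices.

```python
import numpy as np
from scipy.optimize import linprog

def l1_svm_lp(X, y):
    """Hard-margin linear SVM with an L1-norm objective, solved as an LP.

    Solves:  min ||w||_1  subject to  y_i (w @ x_i + b) >= 1.
    Split w = wp - wm with wp, wm >= 0 so the objective is linear; b is free.
    """
    n, d = X.shape
    # Objective: sum(wp) + sum(wm) = ||w||_1; b has zero cost.
    c = np.concatenate([np.ones(2 * d), [0.0]])
    # Constraint y_i (w @ x_i + b) >= 1 rewritten as A_ub @ z <= -1,
    # with z = [wp, wm, b]:  -y_i x_i . wp + y_i x_i . wm - y_i b <= -1.
    A_ub = np.hstack([-y[:, None] * X, y[:, None] * X, -y[:, None]])
    b_ub = -np.ones(n)
    bounds = [(0, None)] * (2 * d) + [(None, None)]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    z = res.x
    return z[:d] - z[d:2 * d], z[-1]  # (w, b)

if __name__ == "__main__":
    # Toy linearly separable data (hypothetical example points).
    X = np.array([[2., 2.], [3., 1.], [2., 3.], [-2., -1.], [-1., -2.], [-3., -2.]])
    y = np.array([1., 1., 1., -1., -1., -1.])
    w, b = l1_svm_lp(X, y)
    print("w =", w, "b =", b)
```

Because both the objective and the constraints are linear in `(wp, wm, b)`, an off-the-shelf LP solver suffices, whereas the L2-norm SVM requires a quadratic program.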