Training v-Support Vector Regression: Theory and Algorithms
- 1 August 2002
- journal article
- Published by MIT Press in Neural Computation
- Vol. 14 (8), 1959-1977
- https://doi.org/10.1162/089976602760128081
Abstract
We discuss the relation between ɛ-support vector regression (ɛ-SVR) and v-support vector regression (v-SVR). In particular, we focus on properties that are different from those of C-support vector classification (C-SVC) and v-support vector classification (v-SVC). We then discuss some issues that do not occur in the case of classification: the possible range of ɛ and the scaling of target values. A practical decomposition method for v-SVR is implemented, and computational experiments are conducted. We show some interesting numerical observations specific to regression.
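The distinction the abstract draws can be illustrated with off-the-shelf implementations: in ɛ-SVR the tube width ɛ is fixed in advance, while in v-SVR the parameter v ∈ (0, 1] controls ɛ indirectly, acting as an upper bound on the fraction of errors and a lower bound on the fraction of support vectors. A minimal sketch using scikit-learn's `SVR` and `NuSVR` (not from the paper; the toy data and parameter values are illustrative assumptions):

```python
import numpy as np
from sklearn.svm import SVR, NuSVR

# Toy 1-D regression data: noisy sine samples.
rng = np.random.RandomState(0)
X = np.sort(rng.uniform(0, 5, 40)).reshape(-1, 1)
y = np.sin(X).ravel() + 0.1 * rng.randn(40)

# ɛ-SVR: the insensitivity tube width ɛ is chosen a priori.
eps_svr = SVR(kernel="rbf", C=1.0, epsilon=0.1).fit(X, y)

# v-SVR: v trades off ɛ automatically; v lower-bounds the
# fraction of support vectors in the solution.
nu_svr = NuSVR(kernel="rbf", C=1.0, nu=0.5).fit(X, y)

# Fraction of training points that became support vectors.
frac_sv = len(nu_svr.support_) / len(X)
```

With `nu=0.5`, roughly half or more of the training points end up as support vectors, reflecting the lower-bound property of v; no such guarantee exists for a hand-picked ɛ in ɛ-SVR.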