Self-concordant analysis for logistic regression
Open Access
- 1 January 2010
- journal article
- research article
- Published by the Institute of Mathematical Statistics in the Electronic Journal of Statistics
- Vol. 4, 384-414
- https://doi.org/10.1214/09-ejs521
Abstract
Most of the non-asymptotic theoretical work in regression is carried out for the square loss, where estimators can be obtained through closed-form expressions. In this paper, we use and extend tools from the convex optimization literature, namely self-concordant functions, to provide simple extensions of theoretical results for the square loss to the logistic loss. We apply the extension techniques to logistic regression with regularization by the ℓ2-norm and regularization by the ℓ1-norm, showing that new results for binary classification through logistic regression can be easily derived from corresponding results for least-squares regression.
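For context, the display below sketches the kind of self-concordance-type property of the logistic loss on which such an extension typically rests. It is a standard calculus computation added here purely as an illustration, not an excerpt from the paper; φ denotes the logistic loss applied to the margin u = y xᵀw, before any ℓ2- or ℓ1-penalty is added.

```latex
% Illustrative only: standard derivatives of the logistic loss,
% with \sigma(u) = 1/(1 + e^{-u}) the sigmoid function.
\[
  \varphi(u) = \log\bigl(1 + e^{-u}\bigr), \qquad
  \varphi''(u) = \sigma(u)\bigl(1 - \sigma(u)\bigr), \qquad
  \varphi'''(u) = \varphi''(u)\bigl(1 - 2\sigma(u)\bigr).
\]
% Since 0 < \sigma(u) < 1, it follows that
\[
  \bigl|\varphi'''(u)\bigr| \;\le\; \varphi''(u)
  \quad \text{for all } u \in \mathbb{R},
\]
% a self-concordance-type bound (the classical definition from interior-point
% theory instead requires |F'''| \le 2\,(F'')^{3/2}). Bounds of this form let
% local quadratic (least-squares-like) approximations of the logistic objective
% be controlled globally, which is the kind of mechanism that allows results
% for the square loss to be transferred to the logistic loss.
```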