Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- 1 January 2012
- journal article
- Published by Society for Industrial & Applied Mathematics (SIAM) in SIAM Journal on Optimization
- Vol. 22 (2), 341-362
- https://doi.org/10.1137/100802001
Abstract
In this paper we propose new methods for solving huge-scale optimization problems. For problems of this size, even the simplest full-dimensional vector operations are very expensive. Hence, we propose to apply an optimization technique based on random partial updates of the decision variables. For these methods, we prove global estimates for the rate of convergence. Surprisingly, for certain classes of objective functions, our results are better than the standard worst-case bounds for deterministic algorithms. We present constrained and unconstrained versions of the method, as well as an accelerated variant. Our numerical tests confirm the high efficiency of this technique on problems of very large size.
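The abstract's core idea, updating one randomly chosen coordinate per step so that each iteration costs far less than a full-gradient pass, can be illustrated with a minimal sketch. The code below is an assumption-laden toy, not the paper's exact method: it runs randomized coordinate descent on a smooth convex quadratic f(x) = ½xᵀAx − bᵀx, using the coordinate-wise step size 1/L_i with L_i = A[i, i], and maintains A·x incrementally so each step touches only one column of A.

```python
import numpy as np

def random_coordinate_descent(A, b, iters=20000, seed=0):
    """Sketch of randomized coordinate descent for f(x) = 0.5 x^T A x - b^T x,
    with A symmetric positive definite (illustrative only, not the paper's code)."""
    rng = np.random.default_rng(seed)
    n = len(b)
    x = np.zeros(n)
    Ax = A @ x  # maintained incrementally so each iteration stays O(n)
    for _ in range(iters):
        i = rng.integers(n)
        g_i = Ax[i] - b[i]      # i-th partial derivative of f at x
        step = g_i / A[i, i]    # step 1/L_i along coordinate i (L_i = A[i, i])
        x[i] -= step
        Ax -= step * A[:, i]    # update A @ x without recomputing it
    return x

# Small positive-definite example: the iterates approach the solution of A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x = random_coordinate_descent(A, b)
```

The point of the incremental `Ax` update is exactly the cost argument in the abstract: a full-gradient method pays O(n²) per iteration on a dense quadratic, while each coordinate step here pays only O(n).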