An aggregate subgradient method for nonsmooth convex minimization
- 1 October 1983
- journal article
- Published by Springer Science and Business Media LLC in Mathematical Programming
- Vol. 27 (3), 320-341
- https://doi.org/10.1007/bf02591907
Abstract
A class of implementable algorithms is described for minimizing any convex, not necessarily differentiable, function f of several variables. The methods require only the calculation of f and one subgradient of f at designated points. They generalize Lemarechal's bundle method. More specifically, instead of using all previously computed subgradients in the search direction-finding subproblems, which are quadratic programming problems, the methods use an aggregate subgradient that is recursively updated as the algorithms proceed. Each algorithm yields a minimizing sequence of points, and if f has any minimizers, then this sequence converges to a solution of the problem. Particular members of this algorithm class terminate when f is piecewise linear. The methods are easy to implement and have flexible storage requirements; the computational effort per iteration can be controlled by the user.
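The aggregation idea in the abstract, keeping a single recursively updated aggregate subgradient instead of a full bundle of past subgradients, can be illustrated with a minimal sketch. This is not Kiwiel's actual algorithm (which uses line searches and quadratic programming subproblems with linearization errors); it is a simplified illustration under stated assumptions, where the aggregate is replaced at each step by the smallest-norm convex combination of itself and the newest subgradient. The objective `f`, the step rule, and all function names are hypothetical:

```python
import numpy as np

def f(x):
    # Hypothetical nonsmooth convex test objective (not from the paper).
    return abs(x[0]) + 2.0 * abs(x[1])

def subgrad(x):
    # One subgradient of f at x; sign(0) = 0 is a valid subgradient here.
    return np.array([np.sign(x[0]), 2.0 * np.sign(x[1])])

def aggregate_subgradient_sketch(x0, steps=200):
    # Keep a single aggregate subgradient p rather than all past
    # subgradients; after each step, replace p by the smallest-norm
    # convex combination of p and the newest subgradient g.
    x = np.asarray(x0, dtype=float)
    p = subgrad(x)
    best_x, best_f = x.copy(), f(x)
    for k in range(1, steps + 1):
        x = x - (1.0 / k) * p          # diminishing step along -p
        g = subgrad(x)
        if p @ p < 1e-12:
            p = g                      # restart aggregation if p vanished
        else:
            d = g - p
            if d @ d > 1e-16:
                # lam minimizes |(1 - lam) * p + lam * g|^2 over [0, 1]
                lam = np.clip((p @ (p - g)) / (d @ d), 0.0, 1.0)
                p = (1.0 - lam) * p + lam * g
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x.copy(), fx
    return best_x, best_f
```

For example, `aggregate_subgradient_sketch([3.0, -2.0])` drives the best recorded objective value toward the minimum of f at the origin. The storage advantage the abstract highlights is visible here: only one vector `p` is kept, regardless of how many iterations have been performed.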
References
- A modification and an extension of Lemarechal's algorithm for nonsmooth minimization, published by Springer Science and Business Media LLC, 1982
- An Algorithm for Constrained Optimization with Semismooth Functions, Mathematics of Operations Research, 1977
- Quasi-Newton Methods, Motivation and Theory, SIAM Review, 1977
- Finding the nearest point in a polytope, Mathematical Programming, 1976
- The Boxstep Method for Large-Scale Optimization, Operations Research, 1975
- A method of conjugate subgradients for minimizing nondifferentiable functions, published by Springer Science and Business Media LLC, 1975
- The Cutting-Plane Method for Solving Convex Programs, Journal of the Society for Industrial and Applied Mathematics, 1960