Title
Optimal subgradient algorithms for large-scale convex optimization in simple domains
Abstract
This paper describes two optimal subgradient algorithms for solving structured large-scale convex constrained optimization problems. More specifically, the first algorithm is optimal for smooth problems with Lipschitz continuous gradients and for Lipschitz continuous nonsmooth problems, and the second algorithm is optimal for Lipschitz continuous nonsmooth problems. In addition, we consider two classes of problems: (i) a convex objective over a simple closed convex domain, where the orthogonal projection onto this feasible domain is efficiently available; and (ii) a convex objective with a simple convex functional constraint. If we equip our algorithms with an appropriate prox-function, the associated subproblem can be solved either in closed form or by a simple iterative scheme, which is especially important for large-scale problems. We report numerical results for some applications to show the efficiency of the proposed schemes.
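As an illustration of case (i) — a simple closed convex domain whose orthogonal projection is efficiently available — the following is a minimal sketch of a classical projected subgradient method in Python. It is a generic textbook scheme, not the optimal algorithms of the paper; the box domain, the ℓ1-type objective, and the diminishing step size are assumptions chosen only for the example:

```python
import math

def f(x, a):
    # Nonsmooth, Lipschitz continuous objective f(x) = sum_i |x_i - a_i|.
    return sum(abs(xi - ai) for xi, ai in zip(x, a))

def project_box(x, lo=0.0, hi=1.0):
    # Orthogonal projection onto the box [lo, hi]^n -- a "simple domain":
    # the projection is available in closed form, coordinate-wise.
    return [min(max(xi, lo), hi) for xi in x]

def subgrad(x, a):
    # One subgradient of f at x: the sign of each coordinate residual.
    return [(xi > ai) - (xi < ai) for xi, ai in zip(x, a)]

def projected_subgradient(a, x0, iters=200):
    # Classical projected subgradient iteration
    #     x_{k+1} = P_C(x_k - t_k g_k),  g_k in the subdifferential of f,
    # with diminishing steps t_k = 1/sqrt(k+1). The best iterate is
    # returned, since subgradient methods are not monotone in f.
    x = project_box(x0)
    best_x, best_f = x, f(x, a)
    for k in range(iters):
        t = 1.0 / math.sqrt(k + 1)
        g = subgrad(x, a)
        x = project_box([xi - t * gi for xi, gi in zip(x, g)])
        fx = f(x, a)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f
```

For instance, minimizing |x₁ − 2| + |x₂ + 1| over the box [0, 1]² drives the iterate to the boundary point (1, 0) with optimal value 2. The optimal schemes in the paper achieve the best possible worst-case complexity for this problem class, which this plain method does not.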
Keywords
Structured convex optimization · Nonsmooth optimization · Optimal complexity · First-order black-box information · Subgradient method · High-dimensional data
Object type
Language
English [eng]
Published in
Title
Numerical Algorithms
Volume
76
Issue
4
First page
1071
Last page
1097
Publisher
Springer Nature
Publication date
2017
Accessibility
Rights statement
© The Author(s)
