TY - JOUR
T1 - Newton-Type Alternating Minimization Algorithm for Convex Optimization
AU - Stella, Lorenzo
AU - Themelis, Andreas
AU - Patrinos, Panagiotis
N1 - Funding Information:
Manuscript received March 3, 2017; revised October 3, 2017; accepted March 31, 2018. Date of publication September 26, 2018; date of current version January 28, 2019. This work was supported by KU Leuven internal funding StG/15/043, by the Fonds de la Recherche Scientifique (FNRS) and the Fonds Wetenschappelijk Onderzoek - Vlaanderen (FWO) under EOS Project 30468160 (SeLMA), and by FWO projects G086318N and G086518N. Recommended by Associate Editor D. Regruto. (Corresponding author: Lorenzo Stella.) L. Stella and A. Themelis are with the Department of Electrical Engineering, Stadius Centre for Dynamical Systems, Signal Processing and Data Analytics, and the Optimization in Engineering Center, KU Leuven, Leuven 3001, Belgium, and also with the IMT School for Advanced Studies Lucca, Lucca 55100, Italy (e-mail: lorenzostella@gmail.com; andreas.themelis@esat.kuleuven.be).
Publisher Copyright:
© 1963-2012 IEEE.
PY - 2019/2
Y1 - 2019/2
AB - We propose a Newton-type alternating minimization algorithm (NAMA) for solving structured nonsmooth convex optimization problems where the sum of two functions is to be minimized, one being strongly convex and the other composed with a linear mapping. The proposed algorithm is a line-search method over a continuous, real-valued, exact penalty function for the corresponding dual problem, which is computed by evaluating the augmented Lagrangian at the primal points obtained by alternating minimizations. As a consequence, NAMA relies on exactly the same computations as the classical alternating minimization algorithm (AMA), also known as the dual-proximal gradient method. Under standard assumptions, the proposed algorithm converges with global sublinear and local linear rates, while under mild additional assumptions, the asymptotic convergence is superlinear, provided that the search directions are chosen according to quasi-Newton formulas. Due to its simplicity, the proposed method is well suited for embedded applications and large-scale problems. Experiments show that using limited-memory directions in NAMA greatly improves the convergence speed over AMA and its accelerated variant.
UR - http://www.scopus.com/inward/record.url?scp=85054210100&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85054210100&partnerID=8YFLogxK
U2 - 10.1109/TAC.2018.2872203
DO - 10.1109/TAC.2018.2872203
M3 - Article
AN - SCOPUS:85054210100
SN - 0018-9286
VL - 64
SP - 697
EP - 711
JO - IEEE Transactions on Automatic Control
JF - IEEE Transactions on Automatic Control
IS - 2
M1 - 8472357
ER -