
arXiv: 2301.04204
In this paper we consider finding an approximate second-order stationary point (SOSP) of general nonconvex conic optimization, which minimizes a twice differentiable function subject to nonlinear equality constraints and a convex conic constraint. In particular, we propose a Newton-conjugate gradient (Newton-CG) based barrier-augmented Lagrangian method for finding an approximate SOSP of this problem. Under some mild assumptions, we show that our method enjoys a total inner iteration complexity of $\widetilde{\cal O}(\epsilon^{-11/2})$ and an operation complexity of $\widetilde{\cal O}(\epsilon^{-11/2}\min\{n,\epsilon^{-5/4}\})$ for finding an $(\epsilon,\sqrt{\epsilon})$-SOSP of general nonconvex conic optimization with high probability. Moreover, under a constraint qualification, these complexity bounds are improved to $\widetilde{\cal O}(\epsilon^{-7/2})$ and $\widetilde{\cal O}(\epsilon^{-7/2}\min\{n,\epsilon^{-3/4}\})$, respectively. To the best of our knowledge, this is the first study on the complexity of finding an approximate SOSP of general nonconvex conic optimization. Preliminary numerical results are presented to demonstrate the superiority of the proposed method over first-order methods in terms of solution quality.
To appear in Computational Optimization and Applications. arXiv admin note: text overlap with arXiv:2301.03139
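For intuition, the sketch below illustrates the generic structure of a barrier-augmented Lagrangian scheme of the kind described in the abstract: a log-barrier handles the conic constraint (here, for concreteness, the nonnegative orthant), an augmented Lagrangian term handles the nonlinear equality constraints, and each subproblem is solved inexactly by a Newton-CG solver. This is a minimal sketch only, not the authors' algorithm: it uses SciPy's standard Newton-CG rather than the paper's specialized Newton-CG with negative-curvature handling, and all function names, tolerances, and update rules below are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def barrier_aug_lagrangian(f, grad_f, c, jac_c, x0, lam0,
                           mu=1.0, rho=10.0, eps=1e-6, max_outer=50):
    """Sketch of a barrier-augmented Lagrangian loop for
    min f(x)  s.t.  c(x) = 0,  x >= 0  (nonnegative orthant as the cone)."""
    x, lam = np.asarray(x0, dtype=float), np.asarray(lam0, dtype=float)
    for _ in range(max_outer):
        def L(x):
            # Barrier-augmented Lagrangian: objective + log-barrier + AL terms.
            if np.any(x <= 0):
                return np.inf             # keep iterates in the cone interior
            r = c(x)
            return f(x) - mu * np.sum(np.log(x)) + lam @ r + 0.5 * rho * (r @ r)

        def gL(x):
            # Gradient of the barrier-augmented Lagrangian.
            xs = np.maximum(x, 1e-12)     # crude safeguard near the boundary
            return grad_f(x) - mu / xs + jac_c(x).T @ (lam + rho * c(x))

        # Inexact inner solve by SciPy's Newton-CG; Hessian-vector products
        # are approximated from the gradient when no Hessian is supplied.
        res = minimize(L, x, jac=gL, method='Newton-CG',
                       options={'xtol': 1e-10, 'maxiter': 200})
        x = res.x
        lam = lam + rho * c(x)            # first-order multiplier update
        mu *= 0.5                         # shrink the barrier parameter
        rho *= 2.0                        # increase the penalty parameter
        if np.linalg.norm(c(x)) <= eps and mu <= eps:
            break
    return x, lam
```

For example, with $f(x)=x^{\top}Qx$ for an indefinite $Q$, a single equality $c(x)=[\,\mathbf{1}^{\top}x-1\,]$, and a strictly feasible start such as `x0 = np.ones(n)/n`, `lam0 = np.zeros(1)`, the loop returns a point that is approximately feasible and approximately stationary for the barrier-augmented Lagrangian. The complexity guarantees quoted in the abstract, however, rely on the paper's capped Newton-CG subroutine with randomized negative-curvature detection, which this sketch does not implement.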
FOS: Computer and information sciences, Computer Science - Machine Learning, operation complexity, Machine Learning (stat.ML), Interior-point methods, Nonconvex programming, global optimization, Machine Learning (cs.LG), second-order stationary point, Numerical mathematical programming methods, 49M05, 49M15, 68Q25, 90C26, 90C30, 90C60, Nonlinear programming, Statistics - Machine Learning, nonconvex conic optimization, FOS: Mathematics, Mathematics - Numerical Analysis, Mathematics - Optimization and Control, Newton-conjugate gradient method, iteration complexity, Numerical Analysis (math.NA), Methods of quasi-Newton type, Optimization and Control (math.OC), barrier method, augmented Lagrangian method
| Indicator | Description | Value |
| --- | --- | --- |
| Selected citations | Citations derived from selected sources (an alternative to the "Influence" indicator, which reflects the overall/total impact of an article based on the underlying citation network, diachronically). | 0 |
| Popularity | The "current" impact/attention (the "hype") of the article in the research community at large, based on the underlying citation network. | Average |
| Influence | The overall/total impact of the article in the research community at large, based on the underlying citation network (diachronically). | Average |
| Impulse | The initial momentum of the article directly after its publication, based on the underlying citation network. | Average |
