
arXiv: 2009.08647
Evolution strategies (ESs) are zeroth-order stochastic black-box optimization heuristics invariant to monotonic transformations of the objective function. They evolve a multivariate normal distribution from which candidate solutions are generated. Among the different variants, CMA-ES is nowadays recognized as one of the state-of-the-art zeroth-order optimizers for difficult problems. Despite ample empirical evidence that ESs with a step-size control mechanism converge linearly, theoretical guarantees of linear convergence of ESs have been established only for limited classes of functions. In particular, theoretical results on convex functions are missing, the setting in which zeroth-order and also first-order optimization methods are commonly analyzed. In this paper, we establish almost sure linear convergence and a bound on the expected hitting time of an ES family, namely the $(1+1)_\kappa$-ES, which includes the (1+1)-ES with the (generalized) one-fifth success rule and an abstract covariance matrix adaptation with bounded condition number, on a broad class of functions. The analysis holds for monotonic transformations of positively homogeneous functions and of quadratically bounded functions, the latter of which in particular includes monotonic transformations of strongly convex functions with Lipschitz continuous gradient. To the best of the authors' knowledge, this is the first work proving linear convergence of ESs on such a broad class of functions.
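As a concrete illustration of the algorithm family described in the abstract, below is a minimal Python sketch of the classical (1+1)-ES with the one-fifth success rule (isotropic sampling only, no covariance matrix adaptation). The damping constant, step-size factors, and stopping criterion are standard textbook choices and assumptions for this sketch, not the exact $(1+1)_\kappa$-ES setting analyzed in the paper.

```python
import numpy as np

def one_plus_one_es(f, x0, sigma0=1.0, max_evals=10_000, f_target=1e-10):
    """Minimal (1+1)-ES with the one-fifth success rule (illustrative sketch only)."""
    x = np.asarray(x0, dtype=float)
    sigma = sigma0
    fx = f(x)
    d = 1.0 + x.size  # damping parameter; a common heuristic choice (assumption)
    for _ in range(max_evals):
        y = x + sigma * np.random.randn(x.size)  # sample one candidate from N(x, sigma^2 I)
        fy = f(y)
        if fy <= fx:
            # success: accept the candidate and increase the step size
            x, fx = y, fy
            sigma *= np.exp(1.0 / d)
        else:
            # failure: decrease the step size; the 1:4 ratio of the two update
            # factors keeps sigma constant when the success probability is 1/5
            sigma *= np.exp(-0.25 / d)
        if fx <= f_target:
            break
    return x, fx, sigma

# Usage: minimize a convex quadratic (an example of a quadratically bounded function)
x_best, f_best, _ = one_plus_one_es(lambda z: float(np.dot(z, z)), np.ones(10))
print(f_best)
```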
SIAM Journal on Optimization (Accepted)
Keywords: Evolution strategies, Black-box optimization, Linear convergence, Stochastic algorithms, Randomized derivative-free optimization, Convex programming, Nonconvex programming, Global optimization, Derivative-free methods and methods using generalized derivatives, Approximation methods and heuristics in mathematical programming, Numerical mathematical programming methods, Numerical Analysis (math.NA)
| Indicator | Description | Value |
| --- | --- | --- |
| Selected citations | Citations derived from selected sources (an alternative to the "influence" indicator) | 7 |
| Popularity | Current impact/attention ("hype") in the research community, based on the underlying citation network | Top 10% |
| Influence | Overall/total impact in the research community, based on the underlying citation network (diachronic) | Average |
| Impulse | Initial momentum directly after publication, based on the underlying citation network | Top 10% |
