Convergence Rate of the (1+1)-Evolution Strategy with Success-Based Step-Size Adaptation on Convex Quadratic Functions

The (1+1)-evolution strategy (ES) with success-based step-size adaptation is analyzed on a general convex quadratic function and its monotone transformation, that is, f(x) = g((x − x*)ᵀ H (x − x*)), where g: ℝ → ℝ is a strictly increasing function, H is a positive-definite symmetric matrix, and x* ∈ ℝᵈ is the optimal solution of f. The convergence rate, that is, the decrease rate of the distance from a search point mₜ to the optimal solution x*, is proven to be in Θ(λ_min / Tr(H)), where λ_min is the smallest eigenvalue of H and Tr(H) is the trace of H. This result generalizes the known rate of Θ(1/d) for the case of H = I_d (I_d is the identity matrix of dimension d) and Θ(1/(d · Cond(H))) for the case of H = diag(ξ·I_{d/2}, I_{d/2}). To the best of our knowledge, this is the first study in which the convergence rate of the (1+1)-ES is derived explicitly and rigorously on a general convex quadratic function, and it reveals the impact of the distribution of the eigenvalues of the Hessian H on the optimization, not only the impact of the condition number of H.
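As a concrete illustration of the algorithm under analysis, below is a minimal sketch of a (1+1)-ES with success-based (1/5-success-rule style) step-size adaptation on a convex quadratic f(x) = (x − x*)ᵀ H (x − x*). The initial point, initial step size, and adaptation factors are hypothetical choices for illustration, not the parameterization used in the paper.

```python
import numpy as np

def one_plus_one_es(H, x_star, iters=20000, seed=0):
    """(1+1)-ES with success-based step-size adaptation on
    f(x) = (x - x_star)^T H (x - x_star).  Returns the final
    search point and its objective value."""
    d = len(x_star)
    rng = np.random.default_rng(seed)
    f = lambda x: (x - x_star) @ H @ (x - x_star)
    m = np.ones(d)              # initial search point (hypothetical choice)
    sigma = 1.0                 # initial step size (hypothetical choice)
    alpha = np.exp(1.0 / d)     # step-size increase factor on success
    fm = f(m)
    for _ in range(iters):
        y = m + sigma * rng.standard_normal(d)  # sample one offspring
        fy = f(y)
        if fy <= fm:            # success: accept offspring, enlarge sigma
            m, fm = y, fy
            sigma *= alpha
        else:                   # failure: shrink sigma; the asymmetric
            sigma *= alpha ** -0.25  # factors target a ~1/5 success rate
    return m, fm
```

Running this on H = I_d versus an ill-conditioned diagonal H makes the abstract's claim observable empirically: the per-iteration decrease of the distance to x* scales roughly with λ_min / Tr(H) rather than with the condition number alone.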