We develop recursive, data-driven, stochastic subgradient methods for optimizing a new, versatile, and application-driven class of convex risk measures, termed here as mean-semideviations, strictly generalizing the well-known and popular mean-upper-semideviation. We introduce the MESSAGE$^{p}$ algorithm, an efficient compositional subgradient procedure for iteratively solving convex mean-semideviation risk-averse problems to optimality. We analyze the asymptotic behavior of the MESSAGE$^{p}$ algorithm under a flexible and structure-exploiting set of problem assumptions. In particular: 1) Under appropriate stepsize rules, we establish pathwise convergence of the MESSAGE$^{p}$ algorithm in a strong technical sense, confirming its asymptotic consistency. 2) Assuming a strongly convex cost, we show that, for fixed semideviation order $p>1$ and for $\epsilon\in[0,1)$, the MESSAGE$^{p}$ algorithm achieves a squared-$\mathcal{L}_{2}$ solution suboptimality rate of the order of $\mathcal{O}(n^{-(1-\epsilon)/2})$ in the number of iterations $n$, where, for $\epsilon>0$, pathwise convergence is simultaneously guaranteed. This result establishes a rate of order arbitrarily close to $\mathcal{O}(n^{-1/2})$, while ensuring strongly stable pathwise operation. For $p\equiv1$, the rate order improves to $\mathcal{O}(n^{-2/3})$, which also suffices for pathwise convergence, and matches previous results. 3) Likewise, in the general case of a convex cost, we show that, for any $\epsilon\in[0,1)$, the MESSAGE$^{p}$ algorithm with iterate smoothing achieves an $\mathcal{L}_{1}$ objective suboptimality rate of the order of $\mathcal{O}(n^{-(1-\epsilon)/2}+n^{-\epsilon/2}\mathbf{1}_{\{p>1\}}+n^{-\epsilon}\mathbf{1}_{\{p\equiv1\}})$ in $n$ iterations. This result provides maximal rates of $\mathcal{O}(n^{-1/4})$, if $p>1$, and $\mathcal{O}(n^{-1/3})$, if $p\equiv1$, matching the state of the art, as well.
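For concreteness, the classical mean-upper-semideviation of order $p$, which the mean-semideviation class strictly generalizes, is $\rho(Z)=\mathbb{E}[Z]+c\,\big(\mathbb{E}\big[((Z-\mathbb{E}[Z])_{+})^{p}\big]\big)^{1/p}$ with $c\in[0,1]$. Below is a minimal, SCGD-style sketch of a two-timescale compositional stochastic subgradient iteration for the special case $p\equiv1$; it is an illustration of the compositional structure, not the paper's exact MESSAGE$^{p}$ updates. The names `F`, `gradF`, `sample_w`, and the stepsize exponents are assumptions made for this sketch; the tracker `y` estimates the inner expectation $\mathbb{E}[F(x,W)]$, which is what makes the problem compositional.

```python
import numpy as np

def messagep_like_sgd(F, gradF, sample_w, x0, c=0.5, n_iters=20000):
    """SCGD-style sketch for min_x E[F(x,W)] + c * E[(F(x,W) - E[F(x,W)])_+].

    F(x, w) -> scalar cost, gradF(x, w) -> (sub)gradient in x,
    sample_w() -> one i.i.d. data sample. All names are illustrative.
    """
    x = np.asarray(x0, dtype=float).copy()
    y = float(F(x, sample_w()))      # running tracker of the inner mean E[F(x,W)]
    for n in range(1, n_iters + 1):
        a_n = 1.0 / n                # slow stepsize for the decision iterate (assumed rule)
        b_n = n ** (-2.0 / 3.0)      # faster stepsize for the mean tracker (assumed rule)
        w1, w2, w3 = sample_w(), sample_w(), sample_w()  # independent samples
        # Two-timescale tracking of the inner expectation.
        y = (1.0 - b_n) * y + b_n * float(F(x, w1))
        # Subgradient estimate: grad E[F] is estimated by gradF(x, w3), and the
        # upper-semideviation term by the indicator 1{F(x,w2) > y} times the
        # centered gradient; independence of w2 and w3 avoids a product bias.
        gbar = gradF(x, w3)
        ind = 1.0 if float(F(x, w2)) > y else 0.0
        g = gbar + c * ind * (gradF(x, w2) - gbar)
        x = x - a_n * g
    return x

# Tiny demo: risk-averse quadratic cost F(x, w) = 0.5 * ||x - w||^2, w ~ N(0, I).
rng = np.random.default_rng(1)
x_hat = messagep_like_sgd(lambda x, w: 0.5 * np.sum((x - w) ** 2),
                          lambda x, w: x - w,
                          lambda: rng.normal(size=3),
                          x0=np.ones(3))
print(x_hat)
```

The two-timescale design, with the mean tracker updated more aggressively than the decision iterate, is the standard way to handle the nested expectation without reevaluating the inner mean at every step; the tradeoff between the two stepsize exponents is what drives the $\epsilon$-parameterized rates stated above.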