
arXiv:1804.00636

Recursive Optimization of Convex Risk Measures: Mean-Semideviation Models

2 April 2018
Dionysios S. Kalogerias
Warren B. Powell
Abstract

We develop recursive, data-driven, stochastic subgradient methods for optimizing a new, versatile, and application-driven class of convex risk measures, termed here mean-semideviations, strictly generalizing the well-known and popular mean-upper-semideviation. We introduce the MESSAGEp algorithm, an efficient compositional subgradient procedure for iteratively solving convex mean-semideviation risk-averse problems to optimality. We analyze the asymptotic behavior of the MESSAGEp algorithm under a flexible and structure-exploiting set of problem assumptions. In particular: 1) Under appropriate stepsize rules, we establish pathwise convergence of the MESSAGEp algorithm in a strong technical sense, confirming its asymptotic consistency. 2) Assuming a strongly convex cost, we show that, for fixed semideviation order $p>1$ and for $\epsilon\in[0,1)$, the MESSAGEp algorithm achieves a squared-$\mathcal{L}_{2}$ solution suboptimality rate of order $\mathcal{O}(n^{-(1-\epsilon)/2})$ in the number of iterations $n$, where, for $\epsilon>0$, pathwise convergence is simultaneously guaranteed. This result establishes a rate arbitrarily close to $\mathcal{O}(n^{-1/2})$, while ensuring strongly stable pathwise operation. For $p\equiv 1$, the rate order improves to $\mathcal{O}(n^{-2/3})$, which also suffices for pathwise convergence and matches previous results. 3) Likewise, in the general case of a convex cost, we show that, for any $\epsilon\in[0,1)$, the MESSAGEp algorithm with iterate smoothing achieves an $\mathcal{L}_{1}$ objective suboptimality rate of order $\mathcal{O}(n^{-(1-\epsilon)/(4\mathbf{1}_{\{p>1\}}+4)})$.
This result provides maximal rates of $\mathcal{O}(n^{-1/4})$ if $p\equiv 1$, and $\mathcal{O}(n^{-1/8})$ if $p>1$, matching the state of the art as well.
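To illustrate the compositional structure the abstract refers to, here is a minimal two-timescale stochastic subgradient sketch for the simplest case $p\equiv 1$, i.e., the standard mean-upper-semideviation $\mathrm{E}[Z] + c\,\mathrm{E}[(Z-\mathrm{E}[Z])_{+}]$, applied to a toy quadratic random cost. The cost $F$, the stepsize exponents, and the risk-aversion weight $c$ are illustrative assumptions; this is a generic compositional scheme in the spirit of the paper, not the paper's MESSAGEp specification.

```python
import numpy as np

rng = np.random.default_rng(0)

def F(x, w):
    """Convex random cost F(x, W); toy quadratic example (assumption)."""
    return (x - w) ** 2

def dF(x, w):
    """Gradient of F with respect to x."""
    return 2.0 * (x - w)

c = 0.5   # risk-aversion weight in [0, 1] (illustrative)
x = 5.0   # decision variable
y = 0.0   # running estimate of the inner mean E[F(x, W)]

for n in range(1, 20001):
    alpha = n ** -0.75   # outer (decision) stepsize -- illustrative rule
    beta = n ** -0.5     # inner (tracking) stepsize, decaying more slowly
    w1, w2 = rng.normal(1.0, 0.3, size=2)  # two independent samples of W
    # fast timescale: track the inner expectation E[F(x, W)]
    y = (1.0 - beta) * y + beta * F(x, w1)
    # stochastic subgradient of E[F] + c * E[(F - E[F])_+] for p = 1:
    #   grad = E[dF] + c * E[ 1{F > E[F]} * (dF - E[dF]) ]
    excess = 1.0 if F(x, w2) > y else 0.0
    g = dF(x, w1) + c * excess * (dF(x, w2) - dF(x, w1))
    # slow timescale: subgradient step on the decision variable
    x -= alpha * g

print(x)  # settles near x = 1: the cost is symmetric about W's mean,
          # so here the risk-averse and risk-neutral minimizers coincide
```

The two decaying stepsizes are the essential compositional ingredient: the inner estimate `y` must adapt faster than `x` moves, so that the indicator `F(x, w2) > y` approximates the event $F > \mathrm{E}[F]$ appearing in the semideviation subgradient.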
