
arXiv:1604.03887
Algorithms for stochastic optimization with functional or expectation constraints

13 April 2016
Guanghui Lan
Zhiqiang Zhou
Abstract

This paper considers the problem of minimizing an expectation function over a closed convex set, coupled with a functional or expectation constraint on either decision variables or problem parameters. We first present a new stochastic approximation (SA) type algorithm, namely the cooperative SA (CSA), to handle problems with a constraint on the decision variables. We show that this algorithm exhibits the optimal O(1/ϵ²) rate of convergence, in terms of both optimality gap and constraint violation, when the objective and constraint functions are generally convex, where ϵ denotes the target accuracy in optimality gap and infeasibility. Moreover, we show that this rate of convergence can be improved to O(1/ϵ) if the objective and constraint functions are strongly convex. We then present a variant of CSA, namely the cooperative stochastic parameter approximation (CSPA) algorithm, to deal with the situation where the constraint is defined over problem parameters, and show that it exhibits a similar optimal rate of convergence to CSA. It is worth noting that CSA and CSPA are primal methods that require neither iterations in the dual space nor estimates of the size of the dual variables. To the best of our knowledge, this is the first time that such optimal SA methods for solving functional or expectation constrained stochastic optimization have been presented in the literature.
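The cooperative step rule the abstract describes — alternate between objective and constraint steps depending on whether the current iterate looks nearly feasible, then average the "good" iterates — can be sketched as follows. This is a minimal one-dimensional illustration, not the authors' implementation; the toy problem, the step sizes γ_t, and the tolerances η_t are placeholder assumptions.

```python
import numpy as np

def csa(obj_grad, constr_val, constr_grad, x0, n_iters, gammas, etas, project):
    """CSA sketch: if the noisy constraint estimate at the current point is
    within tolerance eta_t, step along a stochastic subgradient of the
    objective and mark the iterate as 'good'; otherwise step along a
    subgradient of the constraint. Output the step-size-weighted average
    of the good iterates (a primal method: no dual variables needed)."""
    x = float(x0)
    good_iterates, good_weights = [], []
    for t in range(n_iters):
        if constr_val(x) <= etas[t]:       # nearly feasible: objective step
            d = obj_grad(x)
            good_iterates.append(x)
            good_weights.append(gammas[t])
        else:                              # infeasible: constraint step
            d = constr_grad(x)
        x = project(x - gammas[t] * d)
    if not good_iterates:
        return x
    return float(np.average(good_iterates, weights=good_weights))

# Hypothetical toy problem: minimize E[(x - z)^2] with z ~ N(0, 1),
# subject to the (noisily observed) constraint 1 - x <= 0.
rng = np.random.default_rng(0)
obj_grad = lambda x: 2.0 * (x - rng.normal())         # unbiased stochastic gradient
constr_val = lambda x: 1.0 - x + 0.01 * rng.normal()  # noisy constraint estimate
constr_grad = lambda x: -1.0                          # subgradient of 1 - x
project = lambda x: float(np.clip(x, -5.0, 5.0))      # projection onto [-5, 5]

N = 5000
gammas = [1.0 / np.sqrt(t + 1.0) for t in range(N)]   # placeholder step sizes
etas = [2.0 / np.sqrt(t + 1.0) for t in range(N)]     # placeholder tolerances
x_bar = csa(obj_grad, constr_val, constr_grad, -3.0, N, gammas, etas, project)
# x_bar approaches the constrained optimum x* = 1 as N grows
```

With decaying tolerances η_t, the averaged output drifts toward the boundary of the feasible region where the constrained optimum sits, matching the abstract's point that both optimality gap and constraint violation shrink together.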
