ResearchTrend.AI


An Inexact Halpern Iteration with Application to Distributionally Robust Optimization

8 February 2024
Ling Liang
Zusen Xu
Kim-Chuan Toh
Jia-Jie Zhu
Main: 23 pages · 2 figures · 1 table · Bibliography: 4 pages · Appendix: 11 pages
Abstract

The Halpern iteration for solving monotone inclusion problems has gained increasing interest in recent years due to its simple form and appealing convergence properties. In this paper, we investigate inexact variants of the scheme in both deterministic and stochastic settings. We conduct extensive convergence analysis and show that by choosing the inexactness tolerances appropriately, the inexact schemes admit an O(k^{-1}) convergence rate in terms of the (expected) residue norm. Our results relax the state-of-the-art inexactness conditions employed in the literature while sharing the same competitive convergence properties. We then demonstrate how the proposed methods can be applied to solve two classes of data-driven Wasserstein distributionally robust optimization problems that admit convex-concave min-max optimization reformulations. We highlight their capability of performing inexact computations for distributionally robust learning with stochastic first-order methods.
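The anchored update behind the scheme can be sketched in a few lines. This is a minimal illustration, not the paper's method: the operator T, the rotation example, the anchoring weight 1/(k+2), and the O(k^{-2}) error model for the inexact evaluation are all illustrative assumptions chosen so that the inexactness is summable and the residual norm decays roughly at the O(k^{-1}) rate discussed in the abstract.

```python
import numpy as np

# Illustrative nonexpansive operator: a plane rotation R, whose unique
# fixed point is the origin. (Hypothetical example, not from the paper.)
theta = np.pi / 4
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

def T(x, k=None):
    """Evaluate the operator exactly, or inexactly when an iteration index
    k is given: the added error decays like O(k^{-2}), hence is summable."""
    y = R @ x
    if k is not None:
        y = y + (1.0 / (k + 2) ** 2) * np.array([1.0, -1.0])
    return y

x0 = np.array([1.0, 0.0])   # anchor point
x = x0.copy()
residuals = []
for k in range(2000):
    lam = 1.0 / (k + 2)                       # Halpern anchoring weight
    x = lam * x0 + (1.0 - lam) * T(x, k)      # inexact Halpern step
    residuals.append(np.linalg.norm(x - R @ x))  # exact residual norm
```

Despite the inexact operator evaluations, the recorded residual norms shrink toward zero as the iterate is pulled from the anchor x0 toward the fixed point, consistent with an O(k^{-1}) decay.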

@article{liang2025_2402.06033,
  title={An Inexact Halpern Iteration with Application to Distributionally Robust Optimization},
  author={Ling Liang and Zusen Xu and Kim-Chuan Toh and Jia-Jie Zhu},
  journal={arXiv preprint arXiv:2402.06033},
  year={2025}
}