ResearchTrend.AI

arXiv:2212.14439
An Optimal Algorithm for Strongly Convex Min-min Optimization

29 December 2022
Alexander Gasnikov
D. Kovalev
Grigory Malinovsky
Abstract

In this paper we study the smooth strongly convex minimization problem $\min_x \min_y f(x,y)$. The existing optimal first-order methods require $\mathcal{O}(\sqrt{\max\{\kappa_x, \kappa_y\}} \log 1/\epsilon)$ computations of both $\nabla_x f(x,y)$ and $\nabla_y f(x,y)$, where $\kappa_x$ and $\kappa_y$ are the condition numbers with respect to the variable blocks $x$ and $y$. We propose a new algorithm that requires only $\mathcal{O}(\sqrt{\kappa_x} \log 1/\epsilon)$ computations of $\nabla_x f(x,y)$ and $\mathcal{O}(\sqrt{\kappa_y} \log 1/\epsilon)$ computations of $\nabla_y f(x,y)$. In some applications $\kappa_x \gg \kappa_y$, and computing $\nabla_y f(x,y)$ is significantly cheaper than computing $\nabla_x f(x,y)$; in this case, our algorithm substantially outperforms the existing state-of-the-art methods.
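The per-block asymmetry that motivates the paper can be illustrated with a small numerical sketch. This is not the paper's algorithm: it runs plain (non-accelerated) gradient descent independently on each block of a block-separable quadratic, so the iteration counts scale like $\kappa$ rather than $\sqrt{\kappa}$, but it still shows why counting $\nabla_x$ and $\nabla_y$ evaluations separately matters when $\kappa_x \gg \kappa_y$. All function names and the chosen spectra (condition numbers 100 and 4) are illustrative assumptions.

```python
import numpy as np

def gd_evals_to_tol(eigs, tol=1e-6, max_iter=100_000):
    """Gradient descent on 0.5 * z^T diag(eigs) z, started from z0 = ones,
    with step size 1/L where L = max eigenvalue. Returns the number of
    gradient evaluations needed to reach ||z|| <= tol."""
    z = np.ones_like(eigs)
    step = 1.0 / eigs.max()
    for k in range(1, max_iter + 1):
        z = z - step * (eigs * z)  # one gradient evaluation for this block
        if np.linalg.norm(z) <= tol:
            return k
    return max_iter

# Block-separable f(x, y) = 0.5 x^T A x + 0.5 y^T B y: the two blocks
# decouple, so each can be minimized with its own gradient oracle.
eigs_x = np.linspace(1.0, 100.0, 50)  # kappa_x = 100 (assumed example)
eigs_y = np.linspace(1.0, 4.0, 50)    # kappa_y = 4

nx = gd_evals_to_tol(eigs_x)
ny = gd_evals_to_tol(eigs_y)
print(f"grad_x evaluations: {nx}, grad_y evaluations: {ny}")
```

The ill-conditioned $x$-block needs far more gradient evaluations than the well-conditioned $y$-block, whereas a method whose complexity is governed by $\max\{\kappa_x, \kappa_y\}$ would pay the worst-case count for both oracles.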
