In this paper we study the smooth strongly convex minimization problem
$\min_x \min_y f(x,y)$. The existing optimal first-order methods require
$O\left(\max\{\kappa_x, \kappa_y\} \log(1/\epsilon)\right)$ computations
of both $\nabla_x f(x,y)$ and $\nabla_y f(x,y)$, where $\kappa_x$ and
$\kappa_y$ are the condition numbers with respect to the variable blocks $x$ and $y$, respectively.
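A sketch of one standard block-wise definition of these condition numbers, assuming $f$ is $L_x$-smooth and $\mu_x$-strongly convex in $x$ and $L_y$-smooth and $\mu_y$-strongly convex in $y$ (these constants are not specified above, so this is illustrative only):
\begin{equation*}
    \kappa_x = \frac{L_x}{\mu_x},
    \qquad
    \kappa_y = \frac{L_y}{\mu_y}.
\end{equation*}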
We propose a new algorithm that requires only $O\left(\kappa_x \log(1/\epsilon)\right)$ computations of $\nabla_x f(x,y)$ and
$O\left(\kappa_y \log(1/\epsilon)\right)$ computations of $\nabla_y f(x,y)$. In some applications $\kappa_x \gg \kappa_y$, and the computation of
$\nabla_y f(x,y)$ is significantly cheaper than the computation of $\nabla_x f(x,y)$. In this case, our algorithm substantially outperforms the existing
state-of-the-art methods.
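As an illustrative comparison with hypothetical values (not taken from the text), suppose $\kappa_x = 10^4$ and $\kappa_y = 10^2$. Existing optimal methods then perform on the order of
\begin{equation*}
    10^4 \log(1/\epsilon)
\end{equation*}
computations of each of the two gradients, whereas the proposed algorithm performs on the order of $10^4 \log(1/\epsilon)$ computations of $\nabla_x f(x,y)$ but only $10^2 \log(1/\epsilon)$ computations of the cheaper gradient $\nabla_y f(x,y)$.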