arXiv:1806.00125
Accelerating Incremental Gradient Optimization with Curvature Information

31 May 2018
Hoi-To Wai
Wei Shi
César A. Uribe
A. Nedić
Anna Scaglione
Abstract

This paper studies an acceleration technique for the incremental aggregated gradient ({\sf IAG}) method through the use of \emph{curvature} information for solving strongly convex finite sum optimization problems. These optimization problems arise in large-scale learning applications. Our technique utilizes a curvature-aided gradient tracking step to produce accurate gradient estimates incrementally using Hessian information. We propose and analyze two methods utilizing the new technique, the curvature-aided IAG ({\sf CIAG}) method and the accelerated CIAG ({\sf A-CIAG}) method, which are analogous to the gradient method and Nesterov's accelerated gradient method, respectively. Setting $\kappa$ to be the condition number of the objective function, we prove $R$-linear convergence rates of $1 - \frac{4 c_0 \kappa}{(\kappa+1)^2}$ for the {\sf CIAG} method and $1 - \sqrt{\frac{c_1}{2\kappa}}$ for the {\sf A-CIAG} method, where $c_0, c_1 \leq 1$ are constants inversely proportional to the distance between the initial point and the optimal solution. When the initial iterate is close to the optimal solution, these $R$-linear convergence rates match those of the gradient and accelerated gradient methods, even though {\sf CIAG} and {\sf A-CIAG} operate in an incremental setting with strictly lower computation complexity. Numerical experiments confirm our findings. The source codes used for this paper can be found on \url{http://github.com/hoitowai/ciag/}.
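To make the curvature-aided gradient tracking idea concrete, the following is a minimal Python sketch of a CIAG-style loop. It is not the authors' released implementation (that is in the linked repository); the cyclic component ordering, the step-size handling, and the `grads`/`hessians` callable interface are assumptions made purely for illustration.

```python
import numpy as np

def ciag_sketch(grads, hessians, theta0, step_size, num_epochs):
    """Illustrative curvature-aided incremental aggregated gradient loop.

    Assumes grads[i](theta) and hessians[i](theta) return the gradient
    (shape (d,)) and Hessian (shape (d, d)) of the i-th component f_i.
    """
    n, theta = len(grads), theta0.copy()
    # Per-component Taylor anchor points with cached gradients and Hessians.
    anchors = [theta0.copy() for _ in range(n)]
    g_cache = [grads[i](theta0) for i in range(n)]
    H_cache = [hessians[i](theta0) for i in range(n)]
    # Running aggregates so each iteration only touches one component:
    #   b(theta) = sum_i [ g_i + H_i (theta - anchor_i) ] = lin_sum + H_sum @ theta
    lin_sum = sum(g_cache[i] - H_cache[i] @ anchors[i] for i in range(n))
    H_sum = sum(H_cache)

    for _ in range(num_epochs):
        for i in range(n):  # cyclic pass over the component functions
            # Refresh component i at the current iterate.
            g_new, H_new = grads[i](theta), hessians[i](theta)
            lin_sum += (g_new - H_new @ theta) - (g_cache[i] - H_cache[i] @ anchors[i])
            H_sum += H_new - H_cache[i]
            anchors[i], g_cache[i], H_cache[i] = theta.copy(), g_new, H_new
            # Curvature-aided gradient estimate, then a gradient-style step.
            b = lin_sum + H_sum @ theta
            theta = theta - step_size * b
    return theta
```

The point of maintaining the running sums `lin_sum` and `H_sum` is that each inner iteration costs only one component gradient and Hessian evaluation plus $O(d^2)$ arithmetic, while the estimate $b$ aggregates curvature-corrected information from all $n$ components, which is what allows the incremental method to track the full gradient closely near the optimum.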
