Extra-Newton: A First Approach to Noise-Adaptive Accelerated Second-Order Methods

3 November 2022
Kimon Antonakopoulos
Ali Kavis
V. Cevher
Abstract

This work proposes a universal and adaptive second-order method for minimizing second-order smooth, convex functions. Our algorithm achieves $O(\sigma / \sqrt{T})$ convergence when the oracle feedback is stochastic with variance $\sigma^2$, and improves its convergence to $O(1/T^3)$ with deterministic oracles, where $T$ is the number of iterations. Our method also interpolates between these rates without knowing the nature of the oracle a priori, which is enabled by a parameter-free adaptive step-size that is oblivious to the smoothness modulus, the variance bound, and the diameter of the constraint set. To our knowledge, this is the first universal algorithm with such global guarantees within the second-order optimization literature.
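
To give a feel for the "parameter-free adaptive step-size" idea, the sketch below shows an AdaGrad-style first-order method whose step-size is built only from the observed gradient estimates, so it requires no smoothness modulus or variance bound. This is not the paper's Extra-Newton update (which is second-order); the function `adaptive_gradient_descent` and its noisy quadratic test are illustrative assumptions only.

```python
import numpy as np

def adaptive_gradient_descent(grad_oracle, x0, T, eps=1e-8):
    """Minimize a convex function using a (possibly stochastic) gradient oracle.

    Illustrative sketch only, not the paper's Extra-Newton method.
    The step-size 1 / sqrt(sum of squared gradient-estimate norms) uses no
    problem constants, mirroring the oblivious adaptivity the abstract describes.
    """
    x = np.asarray(x0, dtype=float)
    accum = 0.0  # running sum of squared gradient-estimate norms
    for _ in range(T):
        g = grad_oracle(x)
        accum += float(np.dot(g, g))
        step = 1.0 / (np.sqrt(accum) + eps)  # shrinks automatically under noise
        x = x - step * g
    return x

# Hypothetical test: f(x) = 0.5 * ||x||^2 with a noisy gradient oracle.
rng = np.random.default_rng(0)
oracle = lambda x: x + 0.1 * rng.standard_normal(x.shape)
x_final = adaptive_gradient_descent(oracle, x0=np.ones(5), T=1000)
print(x_final)
```

Because the step-size shrinks with the accumulated squared gradient norms, the same rule behaves like a constant-order step under clean (deterministic) feedback and decays appropriately under noisy feedback, which is the interpolation behavior the abstract attributes to its second-order scheme.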

arXiv: 2211.01832