ResearchTrend.AI
Efficient Curvature-Aware Hypergradient Approximation for Bilevel Optimization

4 May 2025
Youran Dong
Junfeng Yang
Wei Yao
Jin Zhang
Abstract

Bilevel optimization is a powerful tool for many machine learning problems, such as hyperparameter optimization and meta-learning. Estimating hypergradients (also known as implicit gradients) is crucial for developing gradient-based methods for bilevel optimization. In this work, we propose a computationally efficient technique for incorporating curvature information into the approximation of hypergradients and present a novel algorithmic framework based on the resulting enhanced hypergradient computation. We provide convergence rate guarantees for the proposed framework in both deterministic and stochastic scenarios, particularly showing improved computational complexity over popular gradient-based methods in the deterministic setting. This improvement in complexity arises from a careful exploitation of the hypergradient structure and the inexact Newton method. In addition to the theoretical speedup, numerical experiments demonstrate the significant practical performance benefits of incorporating curvature information.
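The hypergradient the abstract refers to comes from the implicit function theorem: for an upper-level objective f and lower-level objective g, it has the form ∇F(x) = ∇ₓf − (∇²ₓᵧg)ᵀ[∇²ᵧᵧg]⁻¹∇ᵧf, where the inverse-Hessian-vector product is solved only approximately. The following is a minimal sketch of this generic computation on a toy quadratic bilevel problem, using conjugate gradient as the inexact linear solver; it illustrates the standard implicit-gradient recipe, not the paper's specific curvature-aware algorithm, and all problem data here are made up for illustration.

```python
import numpy as np

# Toy bilevel problem (hypothetical, for illustration only):
#   lower level: y*(x) = argmin_y 0.5 y^T A y - x^T y   =>  y* = A^{-1} x
#   upper level: F(x)  = 0.5 ||y*(x) - b||^2
# Implicit function theorem:
#   dF/dx = grad_x f - (grad_xy^2 g)^T [grad_yy^2 g]^{-1} grad_y f

rng = np.random.default_rng(0)
n = 5
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)        # SPD lower-level Hessian, grad_yy^2 g
b = rng.standard_normal(n)
x = rng.standard_normal(n)

y_star = np.linalg.solve(A, x)     # exact lower-level solution for the toy problem

def cg(matvec, rhs, iters=50, tol=1e-10):
    """Conjugate gradient for an SPD system; plays the role of the inexact solver."""
    v = np.zeros_like(rhs)
    r = rhs - matvec(v)
    p = r.copy()
    rs = r @ r
    for _ in range(iters):
        Ap = matvec(p)
        alpha = rs / (p @ Ap)
        v += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return v

grad_y_f = y_star - b              # grad_y f evaluated at (x, y*)
v = cg(lambda z: A @ z, grad_y_f)  # approximately solve A v = grad_y f
# For this toy problem grad_x f = 0 and grad_xy^2 g = -I, so the
# hypergradient reduces to -(-I)^T v = v.
hypergrad = v

# Analytic check: dF/dx = A^{-1}(y* - b)
exact = np.linalg.solve(A, y_star - b)
print(np.allclose(hypergrad, exact, atol=1e-8))
```

Only Hessian-vector products with ∇²ᵧᵧg are needed, never the explicit inverse; in the paper's setting, curvature information is used to make this inner solve cheaper and more accurate.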

@article{dong2025_2505.02101,
  title={Efficient Curvature-Aware Hypergradient Approximation for Bilevel Optimization},
  author={Youran Dong and Junfeng Yang and Wei Yao and Jin Zhang},
  journal={arXiv preprint arXiv:2505.02101},
  year={2025}
}