Stochastic noise can be helpful for variational quantum algorithms

13 October 2022
Junyu Liu
Frederik Wilde
Antonio Anna Mele
Xin Jin
Liang Jiang
Jens Eisert
Abstract

Saddle points constitute a crucial challenge for first-order gradient descent algorithms. In classical machine learning, they are avoided, for example, by means of stochastic gradient descent methods. In this work, we provide evidence that the saddle point problem can be naturally avoided in variational quantum algorithms by exploiting the presence of stochasticity. We prove convergence guarantees and present practical examples in numerical simulations and on quantum hardware. We argue that the natural stochasticity of variational algorithms can be beneficial for avoiding strict saddle points, i.e., those saddle points with at least one negative Hessian eigenvalue. The insight that moderate levels of shot noise can help is expected to add a new perspective to near-term variational quantum algorithms.
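As a minimal illustration of the mechanism the abstract describes (a sketch, not the paper's algorithm), consider the toy landscape f(x, y) = x² − y², which has a strict saddle at the origin with Hessian eigenvalues +2 and −2. Deterministic gradient descent started exactly on the saddle never moves, while adding Gaussian noise to each gradient evaluation (a crude stand-in for shot noise) lets the iterate escape along the negative-curvature direction:

```python
import random

def grad(x, y):
    # Gradient of f(x, y) = x**2 - y**2, which has a strict saddle at (0, 0).
    return 2 * x, -2 * y

def descend(steps=200, lr=0.1, noise=0.0, seed=0):
    rng = random.Random(seed)
    x, y = 0.0, 0.0  # start exactly on the saddle point
    for _ in range(steps):
        gx, gy = grad(x, y)
        # Additive Gaussian noise models stochastic gradient estimates
        # (e.g. finite measurement shots in a variational quantum algorithm).
        x -= lr * (gx + noise * rng.gauss(0, 1))
        y -= lr * (gy + noise * rng.gauss(0, 1))
    return x, y

x_det, y_det = descend(noise=0.0)  # exact gradients: stuck at the saddle
x_sgd, y_sgd = descend(noise=0.1)  # noisy gradients: |y| grows geometrically
```

With exact gradients the iterate stays at (0, 0) forever; with noise, any perturbation along y is amplified by a factor of (1 + 2·lr) per step, so the iterate leaves the saddle, while the stable x direction stays contracted.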

@article{liu2025_2210.06723,
  title={Stochastic noise can be helpful for variational quantum algorithms},
  author={Junyu Liu and Frederik Wilde and Antonio Anna Mele and Xin Jin and Liang Jiang and Jens Eisert},
  journal={arXiv preprint arXiv:2210.06723},
  year={2025}
}