Evolutionary Algorithms Are Significantly More Robust to Noise When They Ignore It

31 August 2024
Denis Antipov
Benjamin Doerr
Abstract

Randomized search heuristics (RSHs) are known to have a certain robustness to noise. Mathematical analyses that rigorously quantify how robust RSHs are to noisy access to the objective function typically assume that each solution is re-evaluated whenever it is compared to others. This aims at preventing a single noisy evaluation from having a lasting negative effect, but it is computationally expensive and requires the user to foresee that noise is present (in a noise-free setting, one would never re-evaluate solutions).

In this work, we conduct the first mathematical runtime analysis of an evolutionary algorithm solving a single-objective noisy problem without re-evaluations. We prove that the (1+1) evolutionary algorithm without re-evaluations can optimize the classic LeadingOnes benchmark with noise rates up to a constant, in sharp contrast to the version with re-evaluations, where only noise rates of O(n^{-2} log n) can be tolerated. This result suggests that re-evaluations are much less needed than previously thought, and that they can actually be highly detrimental. The insights from our mathematical proofs indicate that similar results are plausible for other classic benchmarks.
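The algorithm studied here can be illustrated with a minimal sketch: a (1+1) EA that evaluates each individual exactly once and keeps the parent's stored (possibly noisy) fitness, never re-evaluating it. The one-bit prior-noise model below (with probability q, a uniformly random bit is flipped before evaluation) is an assumption for illustration; the exact noise model and parameters analyzed in the paper may differ.

```python
import random

def leading_ones(x):
    """LeadingOnes benchmark: number of leading 1-bits."""
    count = 0
    for bit in x:
        if bit != 1:
            break
        count += 1
    return count

def noisy_eval(x, q):
    """Assumed one-bit prior noise: with probability q, evaluate a copy
    of x with one uniformly chosen bit flipped."""
    if random.random() < q:
        y = list(x)
        i = random.randrange(len(y))
        y[i] = 1 - y[i]
        return leading_ones(y)
    return leading_ones(x)

def one_plus_one_ea_no_reeval(n, q, max_evals=10**6):
    """(1+1) EA without re-evaluations: the parent keeps the fitness
    value from the single evaluation it received when it was created.
    The true fitness is used only to detect termination (a simulation
    convenience, not part of the algorithm)."""
    parent = [random.randint(0, 1) for _ in range(n)]
    parent_fitness = noisy_eval(parent, q)  # evaluated once, never again
    evals = 1
    while leading_ones(parent) < n and evals < max_evals:
        # Standard bit mutation: flip each bit independently with prob 1/n
        child = [1 - b if random.random() < 1.0 / n else b for b in parent]
        child_fitness = noisy_eval(child, q)
        evals += 1
        # Accept on ties or improvement of the *stored* fitness values
        if child_fitness >= parent_fitness:
            parent, parent_fitness = child, child_fitness
    return parent, evals
```

Note the contrast with the re-evaluating variant, which would recompute `noisy_eval(parent, q)` in every iteration; here a single noisy parent evaluation persists until the parent is replaced.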

@article{antipov2025_2409.00306,
  title={Evolutionary Algorithms Are Significantly More Robust to Noise When They Ignore It},
  author={Denis Antipov and Benjamin Doerr},
  journal={arXiv preprint arXiv:2409.00306},
  year={2025}
}