A Novel Evolution Strategy with Directional Gaussian Smoothing for Blackbox Optimization

7 February 2020
Jiaxin Zhang, Hoang Tran, Dan Lu, Guannan Zhang
arXiv:2002.03001
Abstract

We propose an improved evolution strategy (ES) using a novel nonlocal gradient operator for high-dimensional black-box optimization. Standard ES methods with $d$-dimensional Gaussian smoothing suffer from the curse of dimensionality due to the high variance of Monte Carlo (MC) based gradient estimators. To control the variance, Gaussian smoothing is usually limited to a small region, so existing ES methods lack the nonlocal exploration ability required for escaping from local minima. We develop a nonlocal gradient operator with directional Gaussian smoothing (DGS) to address this challenge. The DGS operator conducts 1D nonlocal explorations along $d$ orthogonal directions in $\mathbb{R}^d$, each of which defines a nonlocal directional derivative as a 1D integral. We then use Gauss-Hermite quadrature, instead of MC sampling, to estimate the $d$ 1D integrals, ensuring high accuracy (i.e., small variance). Our method enables effective nonlocal exploration to facilitate the global search in high-dimensional optimization. We demonstrate the superior performance of our method on three sets of examples, including benchmark functions for global optimization and real-world science and engineering applications.
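To make the construction concrete: along each orthonormal direction $\xi_i$, the Gaussian-smoothed directional derivative of $F$ at $x$ is the 1D integral
$$\mathscr{D}_i[F](x) \;=\; \frac{1}{\sigma} \int_{\mathbb{R}} F(x + \sigma v\,\xi_i)\, v\, \frac{e^{-v^2/2}}{\sqrt{2\pi}}\, dv,$$
which an $M$-point Gauss-Hermite rule approximates with only $M$ function evaluations per direction. Below is a minimal NumPy sketch of such an estimator, assuming this standard smoothed-derivative form; the name `dgs_gradient`, the default identity basis `Xi`, and the defaults for `sigma` and `M` are illustrative choices, not the authors' implementation.

```python
import numpy as np

def dgs_gradient(F, x, sigma=0.5, M=7, Xi=None):
    """Sketch of a DGS-style nonlocal gradient estimate of F at x.

    Along each of d orthogonal directions xi_i, the smoothed directional
    derivative (a 1D integral) is approximated with M-point Gauss-Hermite
    quadrature instead of Monte Carlo sampling. Hypothetical helper, not
    the authors' code.
    """
    d = x.size
    if Xi is None:
        Xi = np.eye(d)  # assumed default: the standard coordinate axes
    # Physicists' Gauss-Hermite nodes/weights for integrals against e^{-t^2}
    t, w = np.polynomial.hermite.hermgauss(M)
    dirs = np.zeros(d)
    for i in range(d):
        # Evaluate F at the quadrature nodes along direction xi_i
        vals = np.array([F(x + np.sqrt(2.0) * sigma * tm * Xi[i]) for tm in t])
        # D_i[F](x) ~ sqrt(2)/(sqrt(pi)*sigma) * sum_m w_m t_m F(x + sqrt(2) sigma t_m xi_i)
        dirs[i] = np.sqrt(2.0) * np.sum(w * t * vals) / (np.sqrt(np.pi) * sigma)
    # Assemble the nonlocal gradient from its directional components
    return Xi.T @ dirs

# Toy usage: a few gradient steps on a smooth quadratic (illustrative only)
F = lambda x: float(np.sum((x - 1.0) ** 2))
x = np.zeros(5)
for _ in range(50):
    x -= 0.1 * dgs_gradient(F, x, sigma=0.5, M=7)
print(x)  # approaches the minimizer at all-ones
```

Because the quadrature nodes spread with $\sigma$, a large smoothing radius probes $F$ far from $x$, which is the nonlocal exploration the abstract refers to; the full ES also updates $\sigma$ and the direction set during optimization, which this sketch omits.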
