How to Boost Any Loss Function
arXiv:2407.02279
Richard Nock, Yishay Mansour
2 July 2024

Papers citing "How to Boost Any Loss Function" (3 papers)

Single Point-Based Distributed Zeroth-Order Optimization with a Non-Convex Stochastic Objective Function
Elissa Mhanna, Mohamad Assaad
08 Oct 2024

Zeroth-Order Hard-Thresholding: Gradient Error vs. Expansivity
William de Vazelhes, Hualin Zhang, Huisi Wu, Xiao-Tong Yuan, Bin Gu
11 Oct 2022

Gradient-Free Methods for Deterministic and Stochastic Nonsmooth Nonconvex Optimization
Tianyi Lin, Zeyu Zheng, Michael I. Jordan
12 Sep 2022