arXiv:1911.10367
A Sub-sampled Tensor Method for Non-convex Optimization

23 November 2019
Aurelien Lucchi
Jonas Köhler
Abstract

We present a stochastic optimization method that uses a fourth-order regularized model to find local minima of smooth and potentially non-convex objective functions with a finite-sum structure. The algorithm uses sub-sampled derivatives instead of exact quantities. The proposed approach is shown to find an $(\epsilon_1,\epsilon_2,\epsilon_3)$-third-order critical point in at most $\mathcal{O}\left(\max\left(\epsilon_1^{-4/3}, \epsilon_2^{-2}, \epsilon_3^{-4}\right)\right)$ iterations, thereby matching the rate of deterministic approaches. In order to prove this result, we derive a novel tensor concentration inequality for sums of tensors of any order that makes explicit use of the finite-sum structure of the objective function.
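
The core sub-sampling idea — replacing exact derivatives of a finite-sum objective $f(x) = \frac{1}{n}\sum_i f_i(x)$ with averages over a random mini-batch — can be sketched as follows. This is a minimal illustration using a hypothetical least-squares objective and first-order derivatives only, not the paper's fourth-order regularized method:

```python
import numpy as np

# Hypothetical finite-sum objective: f(x) = (1/n) * sum_i f_i(x),
# with f_i(x) = 0.5 * (a_i . x - b_i)^2. The least-squares terms are
# chosen purely for illustration; the paper treats general smooth f_i.
rng = np.random.default_rng(0)
n, d = 1000, 5
A = rng.normal(size=(n, d))
b = rng.normal(size=n)

def full_gradient(x):
    """Exact gradient, averaging over all n component functions."""
    r = A @ x - b
    return A.T @ r / n

def subsampled_gradient(x, batch_size):
    """Gradient estimate from a uniformly sampled sub-batch of terms."""
    idx = rng.choice(n, size=batch_size, replace=False)
    r = A[idx] @ x - b[idx]
    return A[idx].T @ r / batch_size

x = rng.normal(size=d)
g_full = full_gradient(x)
g_sub = subsampled_gradient(x, batch_size=200)

# Relative error of the sub-sampled estimate; it shrinks as the
# batch size grows, which is what concentration inequalities quantify.
err = np.linalg.norm(g_sub - g_full) / np.linalg.norm(g_full)
```

The same sampling scheme extends to Hessians and third-order derivative tensors; the paper's tensor concentration inequality controls how large the sub-sample must be for such higher-order estimates to stay close to their exact counterparts.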
