Nonconvex proximal splitting: batch and incremental algorithms

1 September 2011
S. Sra
arXiv:1109.0258
Abstract

Within the unmanageably large class of nonconvex optimization, we consider the rich subclass of nonsmooth problems that have \emph{composite} objectives---this already includes the extensively studied convex, composite objective problems as a special case. For this subclass, we introduce a powerful, new framework that permits asymptotically \emph{non-vanishing} perturbations. In particular, we develop perturbation-based batch and incremental (online-like) nonconvex proximal splitting algorithms. To our knowledge, this is the first time that such perturbation-based nonconvex splitting algorithms have been proposed and analyzed. While the main contribution of the paper is the theoretical framework, we complement our results by presenting some empirical results on matrix factorization.
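To make the notion of a composite objective and a splitting step concrete, here is a minimal sketch of the generic forward-backward (proximal gradient) iteration x <- prox_{eta*r}(x - eta*grad f(x)) for min_x f(x) + lambda*||x||_1, where f is smooth (possibly nonconvex) and the l1 term supplies the nonsmooth part. This is only an illustration of the basic splitting template, not the paper's perturbation-based batch or incremental algorithms; the function names, the l1 regularizer, and the step-size rule are assumptions chosen for the example.

```python
import numpy as np

def soft_threshold(v, t):
    """Prox of t * ||.||_1: elementwise soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(grad_f, x0, step, lam, n_iters=500):
    """Forward-backward splitting on F(x) = f(x) + lam * ||x||_1.

    grad_f returns the gradient of the smooth part f, which may be
    nonconvex; only the prox of the nonsmooth part needs to be cheap.
    """
    x = np.array(x0, dtype=float)
    for _ in range(n_iters):
        # Gradient (forward) step on f, then prox (backward) step on r.
        x = soft_threshold(x - step * grad_f(x), step * lam)
    return x

# Toy run with a quadratic smooth part; the update rule itself is
# agnostic to whether f is convex or nonconvex.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
b = rng.standard_normal(20)
grad_f = lambda x: A.T @ (A @ x - b)
step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L for this quadratic f
x_hat = proximal_gradient(grad_f, np.zeros(50), step, lam=0.1)
```

The soft-thresholding prox is what makes the nonsmooth term tractable. The paper's framework concerns how iterations of this general form behave when the gradient computation is inexact, with perturbations that need not vanish asymptotically.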
