ResearchTrend.AI
Alternating Iteratively Reweighted $\ell_1$ and Subspace Newton Algorithms for Nonconvex Sparse Optimization

24 July 2024
Hao Wang
Xiangyu Yang
Yichen Zhu
Abstract

This paper presents a novel hybrid algorithm for minimizing the sum of a continuously differentiable loss function and a nonsmooth, possibly nonconvex, sparse regularization function. The proposed method alternates between solving a reweighted $\ell_1$-regularized subproblem and performing an inexact subspace Newton step. The reweighted $\ell_1$-subproblem allows for efficient closed-form solutions via the soft-thresholding operator, avoiding the computational overhead of proximity operator calculations. As the algorithm approaches an optimal solution, it maintains a stable support set, ensuring that nonzero components stay uniformly bounded away from zero. It then switches to a perturbed regularized Newton method, further accelerating the convergence. We prove global convergence to a critical point and, under suitable conditions, demonstrate that the algorithm exhibits local linear and quadratic convergence rates. Numerical experiments show that our algorithm outperforms existing methods in both efficiency and solution quality across various model prediction problems.
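As a minimal illustration of the soft-thresholding closed form the abstract refers to (a generic sketch of a reweighted $\ell_1$ proximal-gradient step, not the authors' implementation — the function names, step size, and reweighting rule below are illustrative assumptions):

```python
import numpy as np

def soft_threshold(z, tau):
    """Closed-form prox of tau * |x|: sign(z) * max(|z| - tau, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def reweighted_l1_step(x, grad, weights, step_size):
    """One reweighted-l1 subproblem step (sketch): gradient step on the
    smooth loss, then componentwise weighted soft-thresholding."""
    z = x - step_size * grad
    return soft_threshold(z, step_size * weights)

def update_weights(x, eps=1e-8):
    """A common reweighting choice (illustrative): w_i = 1 / (|x_i| + eps),
    so small components are penalized more heavily on the next pass."""
    return 1.0 / (np.abs(x) + eps)
```

Because the subproblem solution is a componentwise formula, each iteration costs only a gradient evaluation plus O(n) arithmetic, which is the "avoiding proximity operator calculations" advantage mentioned above.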

@article{wang2025_2407.17216,
  title={Alternating Iteratively Reweighted $\ell_1$ and Subspace Newton Algorithms for Nonconvex Sparse Optimization},
  author={Hao Wang and Xiangyu Yang and Yichen Zhu},
  journal={arXiv preprint arXiv:2407.17216},
  year={2025}
}