ResearchTrend.AI
Globally Convergent Accelerated Algorithms for Multilinear Sparse Logistic Regression with ℓ0-constraints

17 September 2023
Weifeng Yang
Wenwen Min
Abstract

Tensor data are multidimensional arrays. Regression methods based on low-rank tensor decomposition exploit this structure to reduce the number of parameters, and multilinear logistic regression is a powerful tool for analyzing such multidimensional data. To improve its efficacy and interpretability, we present a Multilinear Sparse Logistic Regression model with ℓ0-constraints (ℓ0-MLSR). In contrast to the ℓ1-norm and ℓ2-norm, the ℓ0-norm constraint is better suited for feature selection; however, because it is nonconvex and nonsmooth, the resulting problem is hard to solve and convergence guarantees are lacking. The multilinear operations in ℓ0-MLSR introduce further nonconvexity. To tackle these challenges, we propose an Accelerated Proximal Alternating Linearized Minimization method with Adaptive Momentum (APALM+) to solve the ℓ0-MLSR model. We prove that APALM+ guarantees convergence of the objective function of ℓ0-MLSR, show that it is globally convergent to a first-order critical point, and establish its convergence rate using the Kurdyka-Łojasiewicz property. Empirical results on synthetic and real-world datasets validate the superior performance of our algorithm in both accuracy and speed compared with other state-of-the-art methods.
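To make the ℓ0-constrained setup concrete, the sketch below shows the key building block shared by proximal methods for such problems: the projection onto the set {w : ||w||_0 ≤ s} is exact hard thresholding (keep the s largest-magnitude entries), which can be plugged into a plain proximal-gradient loop for logistic regression. This is an illustrative simplification, not the paper's APALM+ method, which additionally alternates over the tensor factors and uses adaptive momentum; the function names and step size here are assumptions for the example.

```python
import numpy as np

def hard_threshold(v, s):
    """Project v onto {x : ||x||_0 <= s}: keep the s largest-magnitude entries."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-s:]  # indices of the s largest |v_i|
    out[idx] = v[idx]
    return out

def l0_logistic_regression(X, y, s, step=0.1, n_iter=500):
    """Proximal gradient for ℓ0-constrained logistic regression (y in {0, 1}).

    Each iteration takes a gradient step on the logistic loss, then projects
    back onto the sparsity constraint via hard thresholding.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ w))      # predicted probabilities
        grad = X.T @ (p - y) / n              # gradient of mean logistic loss
        w = hard_threshold(w - step * grad, s)
    return w
```

Unlike ℓ1-regularized solvers, this loop enforces an exact sparsity level s at every iterate, which is the feature-selection advantage of the ℓ0 constraint the abstract highlights; the price is nonconvexity, which is why convergence analyses such as the one in this paper are needed.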
