DC Algorithm for Estimation of Sparse Gaussian Graphical Models

8 August 2024
Tomokaze Shiratori
Yuichi Takano
arXiv: 2408.04206
Abstract

Sparse estimation for Gaussian graphical models is a crucial technique for making the relationships among numerous observed variables more interpretable and quantifiable. Various methods have been proposed, including graphical lasso, which uses the $\ell_1$ norm as a regularization term, as well as methods employing non-convex regularization terms. However, most of these methods approximate the $\ell_0$ norm with continuous surrogate functions. To obtain more accurate solutions, it is desirable to treat the $\ell_0$ norm directly as a regularization term. In this study, we formulate the sparse estimation problem for Gaussian graphical models with the $\ell_0$ norm and propose a method to solve it with the difference-of-convex-functions algorithm (DCA). Specifically, we convert the $\ell_0$-norm constraint into an equivalent largest-$K$ norm constraint, reformulate the constrained problem into a penalized form, and solve the penalized problem with the DCA. Furthermore, we design an algorithm that performs these computations efficiently by using graphical lasso as a subroutine. Experimental results on synthetic data show that our method performs comparably to or better than existing methods. Comparisons of models learned through cross-validation confirm that our method is particularly advantageous for selecting true edges.
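As a rough illustration of the reformulation described in the abstract (the notation below is ours, not the paper's, and details such as whether diagonal entries are penalized are assumptions): writing $S$ for the sample covariance matrix and $\Theta \succ 0$ for the precision matrix, the $\ell_0$-constrained maximum-likelihood problem is

$$\min_{\Theta \succ 0}\; -\log\det\Theta + \operatorname{tr}(S\Theta) \quad \text{subject to} \quad \|\Theta\|_0 \le K.$$

Because the largest-$K$ norm $\|\Theta\|_{(K)}$ (the sum of the $K$ largest absolute entries) satisfies $\|\Theta\|_1 \ge \|\Theta\|_{(K)}$, with equality exactly when $\Theta$ has at most $K$ nonzero entries, the constraint can be rewritten as $\|\Theta\|_1 - \|\Theta\|_{(K)} = 0$ and moved into the objective with a penalty parameter $\rho > 0$:

$$\min_{\Theta \succ 0}\; -\log\det\Theta + \operatorname{tr}(S\Theta) + \rho\left(\|\Theta\|_1 - \|\Theta\|_{(K)}\right).$$

Both $\|\Theta\|_1$ and $\|\Theta\|_{(K)}$ are convex, so this is a difference-of-convex objective. Each DCA iteration linearizes the concave part $-\rho\|\Theta\|_{(K)}$ at the current iterate $\Theta^t$ using a subgradient $G^t \in \partial\|\Theta^t\|_{(K)}$ (one valid choice places $\operatorname{sign}(\Theta^t_{ij})$ on the $K$ entries of largest absolute value and zero elsewhere) and solves the convex subproblem

$$\Theta^{t+1} \in \arg\min_{\Theta \succ 0}\; -\log\det\Theta + \operatorname{tr}\!\left((S - \rho G^t)\Theta\right) + \rho\|\Theta\|_1,$$

which has the form of a standard graphical lasso problem with the modified input matrix $S - \rho G^t$; this is consistent with the abstract's remark that graphical lasso can be used to carry out the computations efficiently.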
