ResearchTrend.AI


arXiv:1010.0311
High-dimensional Ising model selection using $\ell_1$-regularized logistic regression

2 October 2010
Pradeep Ravikumar
Martin J. Wainwright
John D. Lafferty
Abstract

We consider the problem of estimating the graph associated with a binary Ising Markov random field. We describe a method based on $\ell_1$-regularized logistic regression, in which the neighborhood of any given node is estimated by performing logistic regression subject to an $\ell_1$-constraint. The method is analyzed under high-dimensional scaling in which both the number of nodes $p$ and maximum neighborhood size $d$ are allowed to grow as a function of the number of observations $n$. Our main results provide sufficient conditions on the triple $(n, p, d)$ and the model parameters for the method to succeed in consistently estimating the neighborhood of every node in the graph simultaneously. With coherence conditions imposed on the population Fisher information matrix, we prove that consistent neighborhood selection can be obtained for sample sizes $n = \Omega(d^3 \log p)$ with exponentially decaying error. When these same conditions are imposed directly on the sample matrices, we show that a reduced sample size of $n = \Omega(d^2 \log p)$ suffices for the method to estimate neighborhoods consistently. Although this paper focuses on binary graphical models, we indicate how a generalization of the method would apply to general discrete Markov random fields.
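The neighborhood-selection step the abstract describes (regress each node on all the others under an ℓ1 penalty, then read the estimated neighborhood off the support of the fitted coefficients) can be sketched with scikit-learn. The toy chain model, the penalty level `C`, and the threshold `tol` below are illustrative assumptions, not the paper's experimental setup:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy data: n samples from a 4-node chain graph X0 - X1 - X2 - X3,
# where each node copies its left neighbor and flips with probability 0.1.
n, p, flip = 5000, 4, 0.1
X = np.empty((n, p), dtype=int)
X[:, 0] = rng.integers(0, 2, size=n)
for j in range(1, p):
    X[:, j] = X[:, j - 1] ^ (rng.random(n) < flip)

def estimate_neighborhood(X, s, C=0.5, tol=1e-6):
    """Estimate the neighborhood of node s: fit an l1-penalized logistic
    regression of column s on all other columns, and return the indices
    whose fitted coefficients are (numerically) nonzero."""
    others = [j for j in range(X.shape[1]) if j != s]
    clf = LogisticRegression(penalty="l1", solver="liblinear", C=C)
    clf.fit(X[:, others], X[:, s])
    coef = clf.coef_.ravel()
    return {others[k] for k in range(len(others)) if abs(coef[k]) > tol}

print(estimate_neighborhood(X, 1))  # in the chain, the true neighbors of node 1 are {0, 2}
```

In a full graph estimate, the per-node neighborhoods are then combined across all $p$ nodes (for instance by intersection or union of the symmetric estimates), and the inverse penalty strength `C` plays the role of the regularization parameter whose scaling with $(n, p, d)$ the paper analyzes.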

View on arXiv