Sparse Logistic Regression Learns All Discrete Pairwise Graphical Models

28 October 2018
Shanshan Wu, Sujay Sanghavi, A. Dimakis
Abstract

We characterize the effectiveness of a classical algorithm for recovering the Markov graph of a general discrete pairwise graphical model from i.i.d. samples. The algorithm is (appropriately regularized) maximum conditional log-likelihood, which involves solving a convex program for each node; for Ising models this is $\ell_1$-constrained logistic regression, while for more general alphabets an $\ell_{2,1}$ group-norm constraint needs to be used. We show that this algorithm can recover an arbitrary discrete pairwise graphical model, and also characterize its sample complexity as a function of model width, alphabet size, edge parameter accuracy, and the number of variables. We show that along every one of these axes, it matches or improves on all existing results and algorithms for this problem. Our analysis applies a sharp generalization error bound for logistic regression when the weight vector has an $\ell_1$ constraint (or $\ell_{2,1}$ constraint) and the sample vector has an $\ell_{\infty}$ constraint (or $\ell_{2,\infty}$ constraint). We also show that the proposed convex programs can be efficiently solved in $\tilde{O}(n^2)$ running time (where $n$ is the number of variables) under the same statistical guarantees. We provide experimental results to support our analysis.
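For intuition, here is a minimal sketch of the node-wise recovery procedure for the binary (Ising) case. It uses scikit-learn's l1-penalized LogisticRegression as a stand-in solver; the paper analyzes the $\ell_1$-constrained formulation (solved with a mirror-descent method for the $\tilde{O}(n^2)$ guarantee), not this exact implementation, and the function name and the reg and threshold parameters below are illustrative placeholders rather than values from the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def recover_ising_graph(X, reg=0.1, threshold=0.2):
    """Estimate the edge set of an Ising model from samples X in {-1, +1}^(m x n).

    For each node i, fit an l1-penalized logistic regression of X[:, i]
    on the remaining variables; large coefficients mark candidate neighbors.
    """
    n = X.shape[1]
    W = np.zeros((n, n))
    for i in range(n):
        y = (X[:, i] > 0).astype(int)   # node i as a binary label
        Z = np.delete(X, i, axis=1)     # all other nodes as features
        clf = LogisticRegression(penalty="l1", solver="liblinear", C=1.0 / reg)
        clf.fit(Z, y)
        W[i, np.arange(n) != i] = clf.coef_.ravel()
    # Symmetrize the estimated weights and threshold to read off edges.
    return (np.abs(W) + np.abs(W.T)) / 2 > threshold
```

For non-binary alphabets, the abstract notes that the per-node program instead uses an $\ell_{2,1}$ group-norm constraint, grouping the weights associated with each neighboring variable.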

View on arXiv: https://arxiv.org/abs/1810.11905