Compressive Recovery of Sparse Precision Matrices

8 November 2023
Titouan Vayer, Etienne Lasalle, Rémi Gribonval, Paulo Gonçalves
arXiv:2311.04673
Abstract

We consider the problem of learning a graph modeling the statistical relations of the $d$ variables from a dataset with $n$ samples $X \in \mathbb{R}^{n \times d}$. Standard approaches amount to searching for a precision matrix $\Theta$ representative of a Gaussian graphical model that adequately explains the data. However, most maximum likelihood-based estimators usually require storing the $d^2$ values of the empirical covariance matrix, which can become prohibitive in a high-dimensional setting. In this work, we adopt a compressive viewpoint and aim to estimate a sparse $\Theta$ from a "sketch" of the data, i.e., a low-dimensional vector of size $m \ll d^2$ carefully designed from $X$ using non-linear random features. Under certain assumptions on the spectrum of $\Theta$ (or its condition number), we show that it is possible to estimate it from a sketch of size $m = \Omega\left((d+2k)\log(d)\right)$, where $k$ is the maximal number of edges of the underlying graph. These information-theoretic guarantees are inspired by compressed sensing theory and involve restricted isometry properties and instance-optimal decoders. We investigate the possibility of achieving practical recovery with an iterative algorithm based on the graphical lasso, viewed as a specific denoiser. We compare our approach with the graphical lasso on synthetic datasets, demonstrating its favorable performance even when the dataset is compressed.
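
To make the pipeline concrete, here is a minimal Python sketch of the compressive workflow described above. It is an illustration under assumptions, not the paper's implementation: the non-linear random features are taken, for concreteness, to be quadratic features $z_i = \frac{1}{n}\sum_{j}\langle w_i, x_j\rangle^2$ (rank-one measurements of the empirical covariance), the decoder is a simple plug-and-play loop alternating gradient steps on a sketch-fitting loss with scikit-learn's graphical lasso as the denoiser, and all sizes and hyperparameters (`d`, `n`, `m`, `step`, `alpha`) are hypothetical.

```python
import numpy as np
from sklearn.covariance import graphical_lasso

rng = np.random.default_rng(0)

# Hypothetical problem sizes (not from the paper): d variables, n samples,
# sketch size m << d^2.
d, n, m = 30, 2000, 400

# Ground-truth sparse precision matrix: identity plus a few symmetric edges,
# shifted to be positive definite.
Theta = np.eye(d)
for _ in range(15):
    i, j = rng.choice(d, size=2, replace=False)
    Theta[i, j] = Theta[j, i] = 0.2
Theta += (0.1 - min(0.0, np.linalg.eigvalsh(Theta).min())) * np.eye(d)
Sigma = np.linalg.inv(Theta)

# Data X ~ N(0, Sigma); below, it is only ever touched through the sketch z.
X = rng.multivariate_normal(np.zeros(d), Sigma, size=n)

# Sketch: m quadratic random features, z_i = (1/n) sum_j <w_i, x_j>^2.
# Each z_i equals w_i^T S w_i for the empirical covariance S = X^T X / n,
# i.e. a rank-one linear measurement of S -- m numbers instead of d^2.
W = rng.standard_normal((m, d)) / np.sqrt(d)
z = np.mean((X @ W.T) ** 2, axis=0)

def sketch_op(S):
    """Apply the rank-one measurement operator: (w_i^T S w_i)_{i=1..m}."""
    return np.sum((W @ S) * W, axis=1)

# Plug-and-play recovery (illustrative, hyperparameters untuned): gradient
# steps on the sketch-fitting loss 1/(2m) ||A(S) - z||^2, followed by a
# graphical-lasso denoising step that promotes a sparse precision matrix.
S_hat = np.eye(d)
step = 1.0
for _ in range(10):
    for _ in range(20):
        r = sketch_op(S_hat) - z                   # residual in sketch space
        S_hat -= step * (W * r[:, None]).T @ W / m # grad = (1/m) sum r_i w_i w_i^T
    S_hat = (S_hat + S_hat.T) / 2                  # keep the iterate symmetric
    # Project onto the PD cone so graphical_lasso receives a valid covariance.
    eigval, eigvec = np.linalg.eigh(S_hat)
    S_hat = (eigvec * np.clip(eigval, 1e-3, None)) @ eigvec.T
    # Graphical lasso as the denoiser: sparse precision from the current
    # covariance iterate; its covariance output seeds the next outer step.
    S_hat, prec_est = graphical_lasso(S_hat, alpha=0.05)

print("relative precision error:",
      np.linalg.norm(prec_est - Theta) / np.linalg.norm(Theta))
```

Note that once `z` is formed, `X` is never touched again: recovery works from $m$ numbers rather than the $d^2$ entries of the empirical covariance, which is the point of the compressive viewpoint.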
