ResearchTrend.AI

arXiv: 1702.02715
A Fast and Scalable Joint Estimator for Learning Multiple Related Sparse Gaussian Graphical Models

9 February 2017
Beilun Wang
Ji Gao
Yanjun Qi
Abstract

Estimating multiple sparse Gaussian Graphical Models (sGGMs) jointly for many related tasks (large $K$) in a high-dimensional setting (large $p$) is an important problem. Most previous studies of the joint estimation of multiple sGGMs rely on penalized log-likelihood estimators that involve expensive and difficult non-smooth optimizations. We propose a novel approach, FASJEM, for fast and scalable joint structure-estimation of multiple sGGMs at a large scale. As the first study of joint sGGM estimation using the Elementary Estimator framework, our work makes three major contributions: (1) We solve FASJEM in an entry-wise manner that is parallelizable. (2) We optimize FASJEM with a proximal algorithm, which improves the computational complexity from $O(Kp^3)$ to $O(Kp^2)$ and reduces the memory requirement from $O(Kp^2)$ to $O(K)$. (3) We theoretically prove that FASJEM achieves consistent estimation with a convergence rate of $O(\log(Kp)/n_{tot})$. On several synthetic and four real-world datasets, FASJEM shows significant improvements over baselines in accuracy, computational complexity, and memory cost.
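The entry-wise nature of contribution (1) can be illustrated with a soft-thresholding proximal operator, the standard proximal map for an l1 penalty, applied independently to every entry of the $K$ stacked $p \times p$ matrices. This is a minimal sketch under stated assumptions, not the authors' implementation: the function names and the choice of the l1 proximal operator here are illustrative.

```python
import numpy as np

def soft_threshold(x, lam):
    """Entry-wise soft-thresholding: prox of lam * ||.||_1.

    Acts on each entry independently, so for an array of shape
    (K, p, p) all K*p*p updates can run in parallel.
    """
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

# Toy illustration (hypothetical values): K = 2 tasks, p = 2 variables.
backward_maps = np.array([
    [[1.5, -0.2],
     [0.3, -2.0]],
    [[0.8,  0.1],
     [-0.6, 1.2]],
])
sparse_estimates = soft_threshold(backward_maps, 0.5)
```

Because each entry is updated independently of all others, the work splits trivially across the $K p^2$ entries, which is what makes a per-iteration cost of $O(Kp^2)$ attainable.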
