Joint Rank and Variable Selection for Parsimonious Estimation in High-Dimension Finite Mixture Regression Model

2 January 2015
Emilie Devijver
arXiv:1501.00442
Abstract

We study a dimension reduction method for finite mixtures of multivariate response regression models in high dimension. Both the number of responses and the number of predictors may exceed the sample size. We consider predictor selection and rank reduction jointly in order to obtain lower-dimensional approximations of the parameter matrices. This methodology was already developed in [8]. In this paper, we prove that these estimators are adaptive to the unknown matrix sparsity. More precisely, we exhibit a penalty for which the model selected by the penalized likelihood satisfies an oracle inequality. We support our theoretical result with a simulation study and a data analysis.
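The central theoretical object is a penalized-likelihood model-selection criterion whose selected model satisfies an oracle inequality. The display below is a minimal generic sketch of such a criterion and of the typical shape of an oracle inequality, not the paper's exact statement: the model collection \(\mathcal{M}\) (indexing sets of relevant predictors and ranks), the penalty \(\operatorname{pen}(m)\), the divergence \(d\), and the constants \(C, C'\) are illustrative placeholders; the precise penalty and loss are those derived in the paper.

% Illustrative sketch only: generic penalized maximum-likelihood model selection.
% \mathcal{M} indexes candidate models (relevant predictors and ranks);
% \widehat{\theta}_m is the maximum-likelihood estimator within model m.
\[
  \widehat{m} \in \operatorname*{arg\,min}_{m \in \mathcal{M}}
  \left\{ -\frac{1}{n} \sum_{i=1}^{n} \log s_{\widehat{\theta}_m}(y_i \mid x_i)
          + \operatorname{pen}(m) \right\}
\]
% Typical shape of the resulting oracle inequality (constants and loss are generic).
\[
  \mathbb{E}\!\left[ d^{2}\!\bigl(s^{*}, s_{\widehat{\theta}_{\widehat{m}}}\bigr) \right]
  \le C \inf_{m \in \mathcal{M}}
  \left\{ \inf_{\theta \in \Theta_m} d^{2}(s^{*}, s_{\theta}) + \operatorname{pen}(m) \right\}
  + \frac{C'}{n}
\]

Here \(s^{*}\) denotes the true conditional density, \(s_{\theta}\) the density of the mixture regression model with parameter \(\theta\), and \(d\) a suitable divergence (for instance Kullback-Leibler or Hellinger); such an inequality states that the penalized estimator performs, up to constants, as well as the best model in the collection.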
