Two Results on Low-Rank Heavy-Tailed Multiresponse Regressions

23 May 2023
Kangqiang Li
Yuxuan Wang
Abstract

This paper gives two theoretical results on estimating low-rank parameter matrices for linear models with multivariate responses. We first focus on robust parameter estimation for low-rank multi-task learning with heavy-tailed data under quantization, covering two cases: quantization with heavy-tailed responses, and quantization with both heavy-tailed covariates and responses. For each case, our theory shows that the proposed estimator attains a near-minimax-optimal convergence rate. We then investigate low-rank linear models with heavy-tailed matrix-valued responses. The theory shows that when the random noise has only a $(2+\epsilon)$-order moment, our robust estimator still achieves almost the same statistical convergence rate as in the sub-Gaussian case. Moreover, simulation experiments corroborate the theory and demonstrate the superiority of our estimators.
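To make the setting concrete, below is a minimal illustrative sketch of low-rank multiresponse regression with heavy-tailed noise, using a standard truncation-plus-nuclear-norm approach fitted by proximal gradient descent. This is not necessarily the authors' estimator; the dimensions, truncation level `tau`, and penalty `lam` are hypothetical choices for illustration only.

```python
# Hedged sketch (not the paper's method): robust low-rank multiresponse
# regression. Heavy-tailed noise is handled by elementwise truncation of
# the responses; low rank is encouraged by a nuclear-norm penalty solved
# with proximal gradient descent (singular value soft-thresholding).
import numpy as np

rng = np.random.default_rng(0)
n, p, q, r = 500, 20, 10, 3          # samples, covariates, responses, true rank

# Ground-truth low-rank coefficient matrix B (p x q) of rank r
B_true = rng.normal(size=(p, r)) @ rng.normal(size=(r, q))

X = rng.normal(size=(n, p))
noise = rng.standard_t(df=3, size=(n, q))   # heavy tails: only low-order moments
Y = X @ B_true + noise

# Robustification: truncate (clip) the responses at a heuristic level tau
tau = 4.0 * np.sqrt(np.mean(Y ** 2))
Y_t = np.clip(Y, -tau, tau)

def svt(M, lam):
    """Singular value soft-thresholding: the prox operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - lam, 0.0)) @ Vt

# Proximal gradient descent on (1/2n)||Y_t - X B||_F^2 + lam * ||B||_*
lam = 0.1
step = n / (np.linalg.norm(X, 2) ** 2)      # 1 / Lipschitz constant of the gradient
B = np.zeros((p, q))
for _ in range(500):
    grad = X.T @ (X @ B - Y_t) / n
    B = svt(B - step * grad, step * lam)

print("estimated rank:", np.linalg.matrix_rank(B, tol=1e-3))
print("relative error:", np.linalg.norm(B - B_true) / np.linalg.norm(B_true))
```

Under this kind of truncation-based robustification, the fitted matrix typically recovers the low-rank structure despite Student-t noise, which is the qualitative behavior the abstract's convergence-rate results describe.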
