Nonconvex Rectangular Matrix Completion via Gradient Descent without $\ell_{2,\infty}$ Regularization

18 January 2019
Ji Chen
Dekai Liu
Xiaodong Li
arXiv: 1901.06116
Abstract

The analysis of nonconvex matrix completion has recently attracted much attention in the machine learning community thanks to its computational convenience. Existing analyses of this problem, however, usually rely on an $\ell_{2,\infty}$ projection or regularization step involving unknown model parameters, even though such steps are observed to be unnecessary in numerical simulations; see, e.g., Zheng and Lafferty [2016]. In this paper, we extend the analysis of vanilla gradient descent for positive semidefinite matrix completion proposed in Ma et al. [2017] to the rectangular case and, more significantly, improve the required sampling rate from $O(\operatorname{poly}(\kappa)\,\mu^3 r^3 \log^3 n / n)$ to $O(\mu^2 r^2 \kappa^{14} \log n / n)$. Our technical ideas and contributions are potentially useful for improving the leave-one-out analysis in other related problems.
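To make the setting concrete, below is a minimal NumPy sketch of the kind of vanilla gradient descent scheme the abstract refers to: spectral initialization followed by plain gradient steps on the factored least-squares objective, with no $\ell_{2,\infty}$ projection or regularization step. The step size, iteration count, and demo dimensions are illustrative assumptions, not values from the paper.

```python
# Minimal sketch (not the authors' code) of vanilla gradient descent for
# rectangular matrix completion on the factored objective, with NO
# l_{2,infty} projection or regularization. Hyperparameters are illustrative.
import numpy as np

def complete(M_obs, mask, r, eta=0.5, iters=500):
    """M_obs: observed entries (zeros elsewhere); mask: boolean sampling pattern."""
    p = mask.mean()  # empirical sampling rate
    # Spectral initialization: top-r factors of the rescaled observed matrix.
    U, s, Vt = np.linalg.svd(M_obs / p, full_matrices=False)
    X = U[:, :r] * np.sqrt(s[:r])
    Y = Vt[:r].T * np.sqrt(s[:r])
    step = eta / s[0]  # step size scaled by the leading singular value
    for _ in range(iters):
        R = mask * (X @ Y.T - M_obs)  # residual on observed entries only
        # Plain gradient steps on f(X, Y) = (1/2p) * ||P_Omega(X Y^T - M)||_F^2
        X, Y = X - step * (R @ Y) / p, Y - step * (R.T @ X) / p
    return X @ Y.T

# Tiny synthetic demo: recover a rank-2 matrix from ~30% of its entries.
rng = np.random.default_rng(0)
n1, n2, r = 60, 80, 2
M = rng.standard_normal((n1, r)) @ rng.standard_normal((r, n2))
mask = rng.random((n1, n2)) < 0.3
print(np.linalg.norm(complete(M * mask, mask, r) - M) / np.linalg.norm(M))
```

Note the absence of any projection onto an $\ell_{2,\infty}$ ball after each step; the paper's contribution is showing that this unregularized iteration still provably converges, at the improved sampling rate stated above.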
