Learning Polynomials of Few Relevant Dimensions

28 April 2020
Sitan Chen
Raghu Meka
arXiv: 2004.13748
Abstract

Polynomial regression is a basic primitive in learning and statistics. In its most basic form the goal is to fit a degree-$d$ polynomial to a response variable $y$ in terms of an $n$-dimensional input vector $x$. This is extremely well-studied with many applications and has sample and runtime complexity $\Theta(n^d)$. Can one achieve better runtime if the intrinsic dimension of the data is much smaller than the ambient dimension $n$? Concretely, we are given samples $(x, y)$ where $y$ is a degree at most $d$ polynomial in an unknown $r$-dimensional projection (the relevant dimensions) of $x$. This can be seen both as a generalization of phase retrieval and as a special case of learning multi-index models where the link function is an unknown low-degree polynomial. Note that without distributional assumptions, this is at least as hard as junta learning. In this work we consider the important case where the covariates are Gaussian. We give an algorithm that learns the polynomial within accuracy $\epsilon$ with sample complexity that is roughly $N = O_{r,d}(n \log^2(1/\epsilon) (\log n)^d)$ and runtime $O_{r,d}(N n^2)$. Prior to our work, no such results were known even for the case of $r = 1$. We introduce a new filtered PCA approach to get a warm start for the true subspace and use geodesic SGD to boost to arbitrary accuracy; our techniques may be of independent interest, especially for problems dealing with subspace recovery or analyzing SGD on manifolds.
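The abstract describes a two-stage pipeline: a filtered PCA warm start for the relevant subspace, followed by geodesic SGD refinement. Below is a minimal illustrative sketch of the first stage, not the paper's exact procedure: the link function (a phase-retrieval-like $y = \langle w, x \rangle^2$), the quantile-based filtering threshold, and the subspace-error metric are all assumptions made for the example.

```python
import numpy as np

# Illustrative sketch of a filtered-PCA-style warm start for subspace
# recovery. The link function, threshold rule, and error metric are
# assumptions for this example, not the paper's exact choices.

rng = np.random.default_rng(0)

n, r, N = 50, 1, 20000  # ambient dimension, relevant dimension, sample size

# Unknown r-dimensional relevant subspace (orthonormal columns).
U, _ = np.linalg.qr(rng.standard_normal((n, r)))

# Gaussian covariates; response is a low-degree polynomial of the projection.
X = rng.standard_normal((N, n))
z = X @ U                 # relevant coordinates, shape (N, r)
y = z[:, 0] ** 2          # example link: degree-2 (phase-retrieval-like)

# Filtered PCA: keep samples where |y| is large, then take the top-r
# eigenvectors of the second-moment matrix of the retained covariates.
tau = np.quantile(np.abs(y), 0.9)   # illustrative threshold choice
Xf = X[np.abs(y) >= tau]
M = Xf.T @ Xf / len(Xf)
eigvals, eigvecs = np.linalg.eigh(M)   # eigenvalues in ascending order
U_hat = eigvecs[:, -r:]                # estimated relevant subspace

# Subspace distance: spectral norm of the difference of projectors.
err = np.linalg.norm(U @ U.T - U_hat @ U_hat.T, 2)
print(f"warm-start subspace error: {err:.3f}")
```

In the paper's pipeline, such a warm start is then boosted to arbitrary accuracy by SGD constrained to the manifold of $r$-dimensional subspaces (geodesic steps), which this sketch omits.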
