Optimal Learning

30 March 2022
P. Binev
A. Bonito
Ronald A. DeVore
G. Petrova
arXiv:2203.15994
Abstract

This paper studies the problem of learning an unknown function $f$ from given data about $f$. The learning problem is to give an approximation $\hat f$ to $f$ that predicts the values of $f$ away from the data. There are numerous settings for this learning problem depending on (i) what additional information we have about $f$ (known as a model class assumption), (ii) how we measure how well $\hat f$ predicts $f$, (iii) what is known about the data and data sites, and (iv) whether the data observations are polluted by noise. A mathematical description of the optimal performance possible (the smallest possible error of recovery) is known in the presence of a model class assumption. Under standard model class assumptions, it is shown in this paper that a near optimal $\hat f$ can be found by solving a certain discrete over-parameterized optimization problem with a penalty term. Here, near optimal means that the error is bounded by a fixed constant times the optimal error. This explains the advantage of over-parameterization, which is commonly used in modern machine learning. The main results of this paper prove that over-parameterized learning with an appropriate loss function gives a near optimal approximation $\hat f$ of the function $f$ from which the data is collected. Quantitative bounds are given for how much over-parameterization needs to be employed and how the penalization needs to be scaled in order to guarantee a near optimal recovery of $f$. An extension of these results to the case where the data is polluted by additive deterministic noise is also given.
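
As a reading aid, the following is a schematic LaTeX restatement of the framework the abstract describes. The symbols used here ($K$ for the model class, $R(K)$ for the optimal error, $g_\theta$ for the over-parameterized family, $P$ for the penalty, $\lambda$ for its weight) are generic placeholders, not the paper's own notation, and the statements are a sketch rather than the paper's actual theorems.

Given data $y_i = f(x_i)$, $i = 1,\dots,m$, and a model class $K$ assumed to contain $f$, the optimal recovery error may be written as
$$ R(K) \;:=\; \inf_{A}\;\sup_{f \in K}\; \big\| f - A\big(f(x_1),\dots,f(x_m)\big) \big\|, $$
where the infimum runs over all recovery maps $A$ sending the data to an approximation. An approximation $\hat f$ is called near optimal if
$$ \| f - \hat f \| \;\le\; C\, R(K) \qquad \text{for all } f \in K, $$
with a fixed constant $C$ independent of $f$. In this schematic form, the result described in the abstract is that a minimizer of a penalized, over-parameterized fitting problem,
$$ \hat f = g_{\hat\theta}, \qquad \hat\theta \in \operatorname*{argmin}_{\theta}\; \sum_{i=1}^{m} \big| g_\theta(x_i) - y_i \big|^2 \;+\; \lambda\, P(\theta), $$
is near optimal once the amount of over-parameterization and the scaling of $\lambda$ are chosen appropriately; the specific norms, parameterization, and penalty term are those given in the paper itself.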
