A dynamic view of the double descent

3 May 2025
Vivek Shripad Borkar
Abstract

It has been observed by Belkin et al. that overparametrized neural networks exhibit a 'double descent' phenomenon: as the model complexity (as reflected in the number of features) increases, the training error initially decreases, then increases, and then decreases again. A counterpart of this phenomenon in the time domain has been noted in the context of epoch-wise training, viz., the training error decreases with time, then increases, then decreases again. This note presents a plausible explanation for this phenomenon using the theory of two time scale stochastic approximation and singularly perturbed differential equations, applied to the continuous-time limit of the gradient dynamics. This adds a 'dynamic' angle to an already well studied theme.
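As a rough sketch of the machinery the abstract appeals to (the notation below is the generic two time scale stochastic approximation setup and is illustrative, not taken from this paper), the coupled recursions and their singularly perturbed continuous-time limit are typically written as:

% Generic two time scale stochastic approximation: x_n is the fast iterate,
% y_n the slow one; a(n), b(n) are step sizes with b(n)/a(n) -> 0, and
% M^{(1)}, M^{(2)} are martingale difference noise terms (all symbols illustrative).
\begin{align*}
  x_{n+1} &= x_n + a(n)\bigl[h(x_n, y_n) + M^{(1)}_{n+1}\bigr],\\
  y_{n+1} &= y_n + b(n)\bigl[g(x_n, y_n) + M^{(2)}_{n+1}\bigr],
  \qquad \frac{b(n)}{a(n)} \to 0.
\end{align*}
% The associated continuous-time limit is a singularly perturbed ODE:
\begin{align*}
  \varepsilon\,\dot{x}(t) &= h\bigl(x(t), y(t)\bigr),\\
  \dot{y}(t) &= g\bigl(x(t), y(t)\bigr), \qquad 0 < \varepsilon \ll 1,
\end{align*}
% Here the fast variable x(t) tracks a quasi-static equilibrium \lambda(y)
% solving h(\lambda(y), y) = 0, while the slow variable drifts according to
% \dot{y} = g(\lambda(y), y).

This fast-transient-plus-slow-drift structure is the standard picture behind such two time scale arguments; how it is mapped onto the epoch-wise double descent curve is developed in the paper itself.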

@article{borkar2025_2505.01751,
  title={A dynamic view of the double descent},
  author={Vivek Shripad Borkar},
  journal={arXiv preprint arXiv:2505.01751},
  year={2025}
}