One Period to Rule Them All: Identifying Critical Learning Periods in Deep Networks

19 June 2025
Vinicius Yuiti Fukase
Heitor Gama
Bárbara Dias Bueno
Lucas Libanio
A. H. R. Costa
Artur Jordao
arXiv (abs) · PDF · HTML
Main: 9 pages
3 figures
Bibliography: 3 pages
4 tables
Abstract

Critical Learning Periods describe an important phenomenon in deep learning, where early epochs play a decisive role in the success of many training recipes, such as data augmentation. Existing works confirm the existence of this phenomenon and provide useful insights. However, the literature lacks efforts to precisely identify when critical periods occur. In this work, we fill this gap by introducing a systematic approach for identifying critical periods during the training of deep neural networks, focusing on eliminating computationally intensive regularization techniques and effectively applying mechanisms for reducing computational costs, such as data pruning. Our method leverages generalization prediction mechanisms to pinpoint critical phases where training recipes yield maximum benefits to the predictive ability of models. By halting resource-intensive recipes beyond these periods, we significantly accelerate the learning phase and achieve reductions in training time, energy consumption, and CO2 emissions. Experiments on standard architectures and benchmarks confirm the effectiveness of our method. Specifically, we reduce the training time of popular architectures by up to 59.67%, leading to a 59.47% decrease in CO2 emissions and a 60% reduction in financial costs, without compromising performance. Our work enhances understanding of training dynamics and paves the way for more sustainable and efficient deep learning practices, particularly in resource-constrained environments. In the era of the race for foundation models, we believe our method emerges as a valuable framework. The repository is available at this https URL
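
The core idea described in the abstract, halting an expensive training recipe once its benefit to generalization has saturated, can be illustrated with a short sketch. The snippet below is a minimal illustration under assumptions of our own, not the authors' implementation: a plateau in validation accuracy stands in for the paper's generalization prediction mechanism, noise injection stands in for a costly recipe such as data augmentation, and the patience threshold and all names are hypothetical.

# Minimal sketch (not the paper's code): keep an expensive training recipe
# active only while a proxy generalization signal is still improving, then
# halt it for the remaining epochs to save compute.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy synthetic classification task so the sketch is self-contained.
X = torch.randn(512, 20)
y = (X[:, 0] + X[:, 1] > 0).long()
X_val = torch.randn(128, 20)
y_val = (X_val[:, 0] + X_val[:, 1] > 0).long()

model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 2))
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

def val_accuracy():
    # Proxy generalization signal; the paper's generalization prediction
    # mechanism would replace this simple metric.
    with torch.no_grad():
        return (model(X_val).argmax(dim=1) == y_val).float().mean().item()

history = []
patience = 3        # hypothetical: epochs without improvement before halting the recipe
use_recipe = True   # whether the "expensive" recipe is still active
for epoch in range(50):
    # The costly training recipe: here, simple noise-based data augmentation.
    inputs = X + 0.1 * torch.randn_like(X) if use_recipe else X
    opt.zero_grad()
    loss = loss_fn(model(inputs), y)
    loss.backward()
    opt.step()

    history.append(val_accuracy())
    # Plateau heuristic (hypothetical): once the signal stops improving for
    # `patience` epochs, treat the critical period as over and drop the recipe.
    if use_recipe and len(history) > patience and \
            max(history[-patience:]) <= max(history[:-patience]):
        use_recipe = False
        print(f"Assumed critical period ends at epoch {epoch}; halting augmentation.")

print(f"Final validation accuracy: {history[-1]:.3f}")

In the actual method the stopping signal comes from the generalization prediction mechanisms described in the paper rather than raw validation accuracy; the sketch only conveys the structure of training with, then without, the recipe.
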

@article{fukase2025_2506.15954,
  title={One Period to Rule Them All: Identifying Critical Learning Periods in Deep Networks},
  author={Vinicius Yuiti Fukase and Heitor Gama and Barbara Bueno and Lucas Libanio and Anna Helena Reali Costa and Artur Jordao},
  journal={arXiv preprint arXiv:2506.15954},
  year={2025}
}