ResearchTrend.AI
Efficient SGD Neural Network Training via Sublinear Activated Neuron Identification

13 July 2023
Lianke Qin
Zhao Song
Yuanyuan Yang
Abstract

Deep learning has been widely adopted across many fields, but model training typically consumes massive computational resources and time. Designing an efficient neural network training method with a provable convergence guarantee is therefore a fundamental research question. In this paper, we present a static half-space report data structure for a fully connected two-layer neural network with shifted ReLU activation, which enables activated neuron identification in sublinear time via geometric search. We also prove that our algorithm converges in $O(M^2/\epsilon^2)$ time with network size quadratic in the coefficient norm upper bound $M$ and the error term $\epsilon$.
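The key primitive the abstract describes is a half-space report query: given an input $x$, return exactly the neurons whose shifted-ReLU pre-activation is positive, so the forward pass touches only those neurons. Below is a minimal NumPy sketch of that idea; it uses a brute-force $O(md)$ scan in place of the paper's sublinear geometric data structure, and all names (`W`, `a`, `tau`) are illustrative, not from the paper.

```python
import numpy as np

def activated_neurons(W, x, tau):
    """Report indices i with <w_i, x> > tau (the half-space query).

    Brute-force O(m*d) scan; the paper's static half-space report
    structure answers the same query in sublinear time.
    """
    return np.flatnonzero(W @ x > tau)

def sparse_forward(W, a, x, tau):
    """Two-layer net f(x) = sum_i a_i * max(<w_i, x> - tau, 0),
    evaluated using only the activated neurons."""
    idx = activated_neurons(W, x, tau)
    return a[idx] @ (W[idx] @ x - tau)

# Sanity check against the dense forward pass.
rng = np.random.default_rng(0)
m, d, tau = 8, 4, 0.5
W = rng.standard_normal((m, d))   # hidden-layer weights
a = rng.standard_normal(m)        # output-layer coefficients
x = rng.standard_normal(d)        # input

dense = a @ np.maximum(W @ x - tau, 0.0)
assert np.isclose(sparse_forward(W, a, x, tau), dense)
```

With a positive shift `tau`, only a small fraction of neurons is typically active on random inputs, which is what makes identifying them quickly (rather than scanning all $m$ rows) worthwhile during SGD.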
