ResearchTrend.AI

Weight Update Skipping: Reducing Training Time for Artificial Neural Networks

arXiv: 2012.02792
5 December 2020
Pooneh Safayenikoo, Ismail Akturk

Papers citing "Weight Update Skipping: Reducing Training Time for Artificial Neural Networks"

3 papers

  1. FreezeOut: Accelerate Training by Progressively Freezing Layers
     Andrew Brock, Theodore Lim, J. Ritchie, Nick Weston
     15 Jun 2017

  2. Deep Networks with Stochastic Depth
     Gao Huang, Yu Sun, Zhuang Liu, Daniel Sedra, Kilian Q. Weinberger
     30 Mar 2016

  3. Improving neural networks by preventing co-adaptation of feature detectors
     Geoffrey E. Hinton, Nitish Srivastava, A. Krizhevsky, Ilya Sutskever, Ruslan Salakhutdinov
     03 Jul 2012