An Adaptive Stochastic Nesterov Accelerated Quasi Newton Method for Training RNNs

9 September 2019
S. Indrapriyadarsini
Shahrzad Mahboubi
H. Ninomiya
H. Asai
Abstract

A common problem in training neural networks is the vanishing and/or exploding gradient, which is seen most prominently in the training of recurrent neural networks (RNNs); several algorithms have consequently been proposed for training RNNs. This paper proposes a novel adaptive stochastic Nesterov accelerated quasi-Newton (aSNAQ) method for training RNNs. The proposed aSNAQ method accelerates training by combining Nesterov's accelerated gradient term with second-order curvature information. The performance of the proposed method is evaluated in TensorFlow on benchmark sequence modeling problems. The results show improved performance while maintaining a low per-iteration cost, indicating that the method can be used effectively to train RNNs.
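The core idea in the abstract, evaluating the gradient at a Nesterov lookahead point and preconditioning it with quasi-Newton curvature information, can be illustrated with a short sketch. The code below is a minimal illustration under stated assumptions, not the authors' aSNAQ algorithm: the function names (`two_loop`, `naq_step`), the fixed momentum coefficient `mu`, the learning rate, the memory size, and the toy quadratic objective are all hypothetical choices, and the paper's adaptive momentum and stochastic mini-batch updates are omitted.

```python
# A minimal sketch of a Nesterov-accelerated quasi-Newton step, assuming a
# fixed momentum coefficient, full-batch gradients, and a plain L-BFGS
# two-loop recursion. The paper's aSNAQ adds adaptive momentum and
# stochastic (mini-batch) updates, which are not reproduced here.
import numpy as np

def two_loop(g, s_hist, y_hist):
    """L-BFGS two-loop recursion: returns a direction -H @ g built from
    curvature pairs (s, y) without forming the inverse Hessian H."""
    q = g.copy()
    stack = []
    for s, y in zip(reversed(s_hist), reversed(y_hist)):
        rho = 1.0 / (y @ s)
        a = rho * (s @ q)
        q -= a * y
        stack.append((a, rho, s, y))
    if s_hist:                        # initial scaling H0 = (s.y / y.y) I
        s, y = s_hist[-1], y_hist[-1]
        q *= (s @ y) / (y @ y)
    for a, rho, s, y in reversed(stack):
        b = rho * (y @ q)
        q += (a - b) * s
    return -q

def naq_step(w, v, grad_fn, s_hist, y_hist, mu=0.8, lr=0.5, mem=8):
    """One step: the gradient is taken at the lookahead point w + mu*v
    (the Nesterov term) and preconditioned by the quasi-Newton matrix."""
    w_look = w + mu * v
    g = grad_fn(w_look)
    v_new = mu * v + lr * two_loop(g, s_hist, y_hist)
    w_new = w + v_new
    s, y = w_new - w_look, grad_fn(w_new) - g   # secant (curvature) pair
    if y @ s > 1e-10:                 # keep only curvature-positive pairs
        s_hist.append(s); y_hist.append(y)
        if len(s_hist) > mem:
            s_hist.pop(0); y_hist.pop(0)
    return w_new, v_new

# Toy usage on an ill-conditioned quadratic f(w) = 0.5 * w @ A @ w
A = np.diag([1.0, 25.0])
grad_fn = lambda w: A @ w
w, v, s_hist, y_hist = np.array([5.0, 5.0]), np.zeros(2), [], []
for _ in range(30):
    w, v = naq_step(w, v, grad_fn, s_hist, y_hist)
print("||w|| after 30 steps:", np.linalg.norm(w))
```

Note that the only structural change relative to standard momentum methods is where the gradient is evaluated: at the lookahead point `w + mu*v` rather than at `w`, which is what supplies the Nesterov acceleration, while the two-loop recursion supplies the second-order curvature information at low per-iteration cost.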

View on arXiv: 1909.03620