Variational Adaptive Noise and Dropout towards Stable Recurrent Neural Networks

Main: 5 pages, 7 figures; Bibliography: 1 page
Abstract

This paper proposes a novel stable learning theory for recurrent neural networks (RNNs), called variational adaptive noise and dropout (VAND). Noise and dropout on the internal state of RNNs have each been confirmed as stabilizing factors in previous studies, but only separately. We reinterpret the optimization problem of RNNs as variational inference, showing that noise and dropout can be derived simultaneously by transforming the explicit regularization term arising in the optimization problem into implicit regularization. Their scale and ratio can also be adjusted appropriately to optimize the main objective of RNNs. In an imitation learning scenario with a mobile manipulator, only VAND is able to imitate the sequential and periodic behaviors as instructed. this https URL

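Since the abstract only outlines the mechanism, below is a minimal PyTorch sketch of what adaptive noise and dropout on an RNN's internal state can look like. This is not the authors' implementation: the class and parameter names (`VANDCell`, `log_sigma`, `keep_logit`) are hypothetical, and the concrete (relaxed Bernoulli) dropout used to keep the ratio differentiable is a stand-in for whatever relaxation the paper actually derives.

```python
import torch
import torch.nn as nn

class VANDCell(nn.Module):
    """GRU cell with learnable additive noise and dropout on its hidden state.

    Illustrative sketch only; parameterization is an assumption, not the
    paper's derivation.
    """

    def __init__(self, input_size: int, hidden_size: int, temperature: float = 0.1):
        super().__init__()
        self.cell = nn.GRUCell(input_size, hidden_size)
        self.temperature = temperature
        # Noise scale, log-parameterized so it stays positive; it is adjusted
        # by the same gradient that optimizes the main objective.
        self.log_sigma = nn.Parameter(torch.full((hidden_size,), -3.0))
        # Dropout keep-ratio, parameterized through a sigmoid.
        self.keep_logit = nn.Parameter(torch.full((hidden_size,), 2.0))

    def forward(self, x: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        h = self.cell(x, h)
        if self.training:
            # Additive Gaussian noise on the internal state.
            h = h + self.log_sigma.exp() * torch.randn_like(h)
            # Differentiable dropout via the concrete (relaxed Bernoulli)
            # trick, so the dropout ratio itself can be learned.
            keep = torch.sigmoid(self.keep_logit)
            u = torch.rand_like(h).clamp(1e-6, 1.0 - 1e-6)
            mask = torch.sigmoid(
                (keep.log() - (1.0 - keep).log() + u.log() - (1.0 - u).log())
                / self.temperature
            )
            h = h * mask / keep.clamp(min=1e-6)  # rescale to preserve expectation
        return h

# Usage: unroll over a sequence of length 10 with batch size 8.
cell = VANDCell(input_size=16, hidden_size=64)
h = torch.zeros(8, 64)
for x_t in torch.randn(10, 8, 16):
    h = cell(x_t, h)
```

The learnable keep-ratio here follows the concrete-dropout relaxation; the paper instead derives the noise and dropout terms jointly from a variational reinterpretation of the RNN objective, so the exact parameterization will differ.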
@article{kobayashi2025_2506.01350,
  title={Variational Adaptive Noise and Dropout towards Stable Recurrent Neural Networks},
  author={Taisuke Kobayashi and Shingo Murata},
  journal={arXiv preprint arXiv:2506.01350},
  year={2025}
}