Feedforward and Recurrent Neural Networks Backward Propagation and Hessian in Matrix Form

Abstract

In this paper we focus on the linear algebra theory behind feedforward (FNN) and recurrent (RNN) neural networks. We review backward propagation, including backward propagation through time (BPTT). Also, we obtain a new exact expression for the Hessian, which represents second-order effects. We show that for $t$ time steps the weight gradient can be expressed as a rank-$t$ matrix, while the weight Hessian is a sum of $t^{2}$ Kronecker products of rank-$1$ and $W^{T}AW$ matrices, for some matrix $A$ and weight matrix $W$. Also, we show that for a mini-batch of size $r$, the weight update can be expressed as a rank-$rt$ matrix. Finally, we briefly comment on the eigenvalues of the Hessian matrix.
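As a minimal numerical illustration of the rank-$t$ claim (our own sketch, not code from the paper, assuming a toy linear RNN $h_s = W h_{s-1} + x_s$ with loss $E = \tfrac{1}{2}\|h_t\|^2$): BPTT writes the gradient $\partial E / \partial W$ as $\sum_{s=1}^{t} \delta_s h_{s-1}^{T}$, a sum of $t$ rank-$1$ outer products, so its rank is at most $t$.

```python
import numpy as np

# Sketch: linear RNN h_s = W h_{s-1} + x_s, loss E = 0.5 * ||h_t||^2.
# BPTT gives dE/dW = sum_{s=1}^{t} delta_s h_{s-1}^T, a sum of t rank-1
# outer products, so rank(dE/dW) <= t. (Illustration only; model and loss
# are assumptions, not taken from the paper.)

rng = np.random.default_rng(0)
n, t = 8, 3                          # hidden size, number of time steps
W = 0.1 * rng.standard_normal((n, n))
xs = rng.standard_normal((t, n))

# Forward pass, keeping hidden states for BPTT.
hs = [rng.standard_normal(n)]        # random initial hidden state h_0
for s in range(t):
    hs.append(W @ hs[-1] + xs[s])

# Backward pass through time: delta_t = h_t, delta_s = W^T delta_{s+1}.
grad = np.zeros((n, n))
delta = hs[-1]                       # dE/dh_t for E = 0.5 * ||h_t||^2
for s in range(t, 0, -1):
    grad += np.outer(delta, hs[s - 1])   # rank-1 contribution of step s
    delta = W.T @ delta

print(np.linalg.matrix_rank(grad), "<=", t)   # rank is at most t
```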
