Whiteout: Gaussian Adaptive Regularization Noise in Deep Neural Networks

Abstract

Noise injection (NI) is an off-the-shelf method to mitigate over-fitting in neural networks (NNs). Recent developments in Bernoulli NI, as implemented in the dropout and shakeout procedures, demonstrate the efficiency and feasibility of NI in regularizing deep NNs. We propose whiteout, a new regularization technique that injects adaptive Gaussian noise into deep NNs. We show that in generalized linear models whiteout is associated with a deterministic optimization objective function whose closed-form penalty term connects to the bridge, lasso, ridge, and elastic net penalties, and that it can also be extended to offer regularization similar to the adaptive lasso and group lasso. We also demonstrate that whiteout can be viewed as robust learning of NN models in the presence of small perturbations in input and hidden nodes. Compared to dropout, whiteout performs better on training data of relatively small size, owing to the sparsity introduced through the $l_1$ regularization. Compared to shakeout, the penalized objective function in whiteout is more stable given the continuity of Gaussian noise. We establish theoretically that the noise-perturbed empirical loss function with whiteout converges almost surely to the ideal loss function, and that the estimates of NN parameters obtained by minimizing the former are consistent with those obtained by minimizing the latter. Whiteout can be incorporated into the back-propagation algorithm and is computationally efficient. The superiority of whiteout over dropout and shakeout in training NNs for classification is demonstrated on the MNIST and CIFAR-10 data.
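
To make the noise-injection idea concrete, here is a minimal NumPy sketch of a dense layer with whiteout-style additive adaptive Gaussian noise. It is an illustration, not the paper's exact scheme: the variance form sigma2 * |w|^(-gamma) + lam and the parameters sigma2, gamma, and lam are assumptions chosen so that weakly weighted connections receive more noise (an $l_1$-like effect) plus a constant noise component (an $l_2$-like effect), in the spirit of the bridge/lasso/ridge connections described above.

```python
import numpy as np

def whiteout_dense(x, W, b, sigma2=0.1, gamma=1.0, lam=0.0, rng=None, train=True):
    """Dense layer with whiteout-style additive adaptive Gaussian noise.

    Sketch only: the per-connection noise scale
    sqrt(sigma2 * |w|**(-gamma) + lam) is an assumed illustration of the
    "adaptive" variance described in the abstract; sigma2, gamma, and lam
    are hypothetical tuning parameters, not taken from the paper.
    """
    if not train:
        return x @ W + b                                  # no noise at test time
    rng = rng or np.random.default_rng()
    # Larger noise on small-magnitude weights (l1-like), plus a constant
    # component lam (l2-like); clip |w| to avoid division by zero.
    std = np.sqrt(sigma2 * np.maximum(np.abs(W), 1e-8) ** (-gamma) + lam)
    # Independent noise e_ij on the contribution of input i to unit j,
    # drawn afresh for every example in the mini-batch.
    e = rng.normal(size=(x.shape[0],) + W.shape) * std
    return x @ W + (e * W).sum(axis=1) + b

# Example usage with random data (shapes are illustrative).
rng = np.random.default_rng(0)
x = rng.normal(size=(32, 100))                            # mini-batch of 32 inputs
W = rng.normal(scale=0.1, size=(100, 50))
b = np.zeros(50)
h = np.tanh(whiteout_dense(x, W, b, sigma2=0.05, gamma=1.0, lam=0.01, rng=rng))
```

Because the injected noise is additive and Gaussian, this layer remains differentiable in the weights, so it drops into a standard back-propagation loop without modification.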
