Whiteout: Gaussian Adaptive Regularization Noise in Deep Neural Networks

Noise injection is an off-the-shelf method for mitigating over-fitting in neural networks (NNs). Recent developments in Bernoulli noise injection, as implemented in the dropout and shakeout procedures, demonstrate the efficiency and feasibility of noise injection for regularizing deep NNs. We propose whiteout, a new regularization technique that injects adaptive Gaussian noise into a deep NN. Whiteout has three tuning parameters, providing flexibility during the training of NNs. We show that, in the context of generalized linear models, whiteout is associated with a deterministic optimization objective function with a closed-form penalty term, and that it includes the lasso, ridge regression, the adaptive lasso, and the elastic net as special cases. We also demonstrate that whiteout can be viewed as robust learning of an NN model in the presence of small and insignificant perturbations to the input and hidden nodes. Compared to dropout, whiteout performs better on training data of relatively small size, owing to the sparsity introduced through the regularization. Compared to shakeout, the penalized objective function in whiteout has better convergence behavior and is more stable, given the continuity of the injected noise. We establish theoretically that the noise-perturbed empirical loss function with whiteout converges almost surely to the ideal loss function, and that the estimates of the NN parameters obtained by minimizing the former loss function are consistent with those obtained by minimizing the ideal loss function. Whiteout can be incorporated into the back-propagation algorithm and is computationally efficient. The superiority of whiteout over dropout and shakeout in training NNs for classification is demonstrated on the MNIST data.
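The abstract states that whiteout injects adaptive Gaussian noise governed by three tuning parameters, but it does not spell out the noise distribution. The sketch below is a minimal illustration of such a scheme for one fully connected layer: it assumes additive, per-connection Gaussian noise whose variance depends on the magnitude of the corresponding weight through three hypothetical parameters (sigma2, gamma, lam). The function name whiteout_forward and the exact variance form are illustrative assumptions, not the paper's verified specification.

```python
import numpy as np

def whiteout_forward(x, W, b, sigma2=1.0, gamma=1.0, lam=0.0, rng=None, train=True):
    """Forward pass through one fully connected layer with whiteout-style
    additive Gaussian noise injected into the input nodes.

    Assumed (illustrative) noise form: the noise added to input node i on the
    connection to output node j has variance sigma2 * |W[i, j]|**(-gamma) + lam,
    so the three tuning parameters are sigma2, gamma, and lam.
    """
    rng = np.random.default_rng() if rng is None else rng
    if not train:
        return x @ W + b  # no noise injection at test time
    # Per-connection noise standard deviation; the small constant guards
    # against exactly-zero weights when gamma > 0.
    std = np.sqrt(sigma2 * (np.abs(W) + 1e-12) ** (-gamma) + lam)  # shape (d_in, d_out)
    eps = rng.normal(0.0, 1.0, size=W.shape) * std                  # adaptive Gaussian noise
    # Output unit j sees the perturbed input (x_i + eps_ij) weighted by W[i, j]:
    # sum_i (x_i + eps_ij) * W[i, j] + b_j.
    return x @ W + (eps * W).sum(axis=0) + b

# Tiny usage example with random data.
rng = np.random.default_rng(0)
x = rng.normal(size=3)                # one input example with 3 features
W = rng.normal(size=(3, 2)) * 0.5     # weights to 2 hidden units
b = np.zeros(2)
print(whiteout_forward(x, W, b, sigma2=0.1, gamma=1.0, lam=0.01, rng=rng))
```

Under this assumed form, shrinking gamma toward 0 makes the noise variance uniform across weights (ridge-like behavior), while larger gamma injects more noise into connections with small weights, which is one way the sparsity mentioned in the abstract could arise.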