Principled Curriculum Learning using Parameter Continuation Methods
- ODL

Main: 4 pages · 1 figure · Bibliography: 2 pages · 2 tables
Abstract
In this work, we propose a parameter continuation method for the optimization of neural networks. There is a close connection between parameter continuation, homotopy methods, and curriculum learning. The methods we propose are theoretically justified and practically effective for several problems in deep neural networks. In particular, they achieve better generalization than state-of-the-art optimizers such as Adam on both supervised and unsupervised learning tasks.
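To illustrate the core idea of parameter continuation described in the abstract, the sketch below blends an easy, convex surrogate objective into a harder target objective as a continuation parameter moves from 0 to 1, so each stage starts from the solution of a slightly easier problem. The specific losses, schedule, and step sizes are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

# Illustrative continuation (homotopy) loop: minimize
#   H(theta, lam) = (1 - lam) * L_easy(theta) + lam * L_hard(theta)
# while sweeping lam from 0 (easy problem) to 1 (target problem).
# The losses below are toy choices for demonstration only.

def easy_loss_grad(theta):
    # Smooth convex surrogate 0.5 * ||theta||^2; gradient is theta.
    return theta

def hard_loss_grad(theta):
    # Harder, nonconvex target sum(1 - cos(theta)); gradient is sin(theta).
    return np.sin(theta)

def continuation_train(theta0, stages=10, steps_per_stage=200, lr=0.1):
    """Gradient descent on the homotopy, warm-starting each stage
    from the previous stage's solution."""
    theta = np.array(theta0, dtype=float)
    for lam in np.linspace(0.0, 1.0, stages):
        for _ in range(steps_per_stage):
            grad = (1.0 - lam) * easy_loss_grad(theta) + lam * hard_loss_grad(theta)
            theta -= lr * grad
    return theta

theta_final = continuation_train([2.5, -2.5])
```

Starting from the easy convex problem pulls the iterates toward a good basin before the nonconvex target fully takes over, which is the continuation analogue of presenting easy examples first in curriculum learning.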
