Shrinkage degree in $L_2$-re-scale boosting for regression

Abstract

Re-scale boosting (RBoosting) is a variant of boosting that can essentially improve the generalization performance of boosting learning. The key feature of RBoosting is the introduction of a shrinkage degree that re-scales the ensemble estimate in each gradient-descent step; the shrinkage degree therefore determines the performance of RBoosting. The aim of this paper is to develop a concrete analysis of how to determine the shrinkage degree in $L_2$-RBoosting. We propose two feasible ways to select the shrinkage degree: the first parameterizes it, and the second derives it from the data. After rigorously analyzing the importance of the shrinkage degree in $L_2$-RBoosting learning, we compare the pros and cons of the proposed methods. We find that although both approaches attain the same learning rates, the final estimate produced by the parameterized approach has a better structure, which can yield better generalization capability when the number of samples is finite. We therefore recommend parameterizing the shrinkage degree of $L_2$-RBoosting. To this end, we present an adaptive parameter-selection strategy for the shrinkage degree and verify its feasibility through both theoretical analysis and numerical experiments. The obtained results enhance the understanding of RBoosting and give further guidance on how to use $L_2$-RBoosting for regression tasks.
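To make the re-scaling step concrete, the following is a minimal sketch of $L_2$-RBoosting for one-dimensional regression with decision stumps as weak learners. It is an illustration under stated assumptions, not the authors' implementation: the parameterized shrinkage degree `alpha_k = 2/(k+2)`, the stump learner, and the helper names `fit_stump` / `predict_stump` are all choices made here for the example.

```python
import numpy as np

def fit_stump(x, r):
    """Fit a decision stump (threshold split) to residuals r by
    minimizing squared error over candidate thresholds in x."""
    best_err, best_stump = np.inf, None
    for t in np.unique(x):
        left, right = r[x <= t], r[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        pred = np.where(x <= t, left.mean(), right.mean())
        err = np.sum((r - pred) ** 2)
        if err < best_err:
            best_err, best_stump = err, (t, left.mean(), right.mean())
    return best_stump

def predict_stump(stump, x):
    t, left_val, right_val = stump
    return np.where(x <= t, left_val, right_val)

def l2_rboosting(x, y, n_rounds=50, shrinkage=lambda k: 2.0 / (k + 2)):
    """L2 re-scale boosting sketch: at step k the current ensemble
    estimate F is shrunk by the factor (1 - alpha_k) before the new
    weak learner is added. The shrinkage degree alpha_k = 2/(k+2)
    used here is an illustrative parameterized choice."""
    F = np.zeros_like(y, dtype=float)
    ensemble = []
    for k in range(n_rounds):
        alpha = shrinkage(k)
        F = (1.0 - alpha) * F          # re-scale the ensemble estimate
        r = y - F                      # negative gradient of L2 loss
        stump = fit_stump(x, r)
        if stump is None:
            break
        h = predict_stump(stump, x)
        if h @ h == 0:
            break
        beta = (r @ h) / (h @ h)       # exact line search for L2 loss
        F = F + beta * h
        ensemble.append((stump, beta))
    return F, ensemble
```

The re-scaling factor `(1 - alpha_k)` is what distinguishes this from plain $L_2$-boosting: with `alpha_k = 0` the loop reduces to ordinary greedy gradient boosting.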
