
Gradient Boosted Binary Histogram Ensemble for Large-scale Regression

Abstract

In this paper, we propose a gradient boosting algorithm for large-scale regression problems called \textit{Gradient Boosted Binary Histogram Ensemble} (GBBHE), based on binary histogram partition and ensemble learning. From the theoretical perspective, by assuming the H\"{o}lder continuity of the target function, we establish the statistical convergence rates of GBBHE in the spaces $C^{0,\alpha}$ and $C^{1,0}$, where a lower bound on the convergence rate of the base learner demonstrates the advantage of boosting. Moreover, in the space $C^{1,0}$, we prove that the number of iterations required to achieve the fast convergence rate can be reduced by using an ensemble regressor as the base learner, which improves computational efficiency. In the experiments, compared with other state-of-the-art algorithms such as gradient boosted regression trees (GBRT), Breiman's forest, and kernel-based methods, our GBBHE algorithm shows promising performance with less running time on large-scale datasets.
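As an illustration only, and not the authors' implementation, the sketch below shows the overall structure the abstract describes: a gradient boosting loop for squared loss in which each boosting step fits an averaged ensemble of random histogram regressors to the current residuals. All names here (fit_histogram_regressor, gbbhe_sketch), parameters, and simplifications (a single randomly chosen feature per histogram, equal-width bins) are hypothetical choices made for readability; the paper's base learner partitions the full feature space with random binary histograms.

import numpy as np

def fit_histogram_regressor(X, residuals, n_bins=8, rng=None):
    # Hypothetical base learner: bin one randomly chosen feature into
    # equal-width cells and predict the mean residual within each cell.
    # (This one-feature version only illustrates the histogram idea.)
    rng = rng if rng is not None else np.random.default_rng()
    j = int(rng.integers(X.shape[1]))
    edges = np.linspace(X[:, j].min(), X[:, j].max(), n_bins + 1)
    cells = np.clip(np.digitize(X[:, j], edges[1:-1]), 0, n_bins - 1)
    values = np.array([residuals[cells == c].mean() if np.any(cells == c) else 0.0
                       for c in range(n_bins)])
    def predict(X_new):
        c = np.clip(np.digitize(X_new[:, j], edges[1:-1]), 0, n_bins - 1)
        return values[c]
    return predict

def gbbhe_sketch(X, y, n_iter=50, learning_rate=0.1, n_ensemble=5, seed=0):
    # Gradient boosting for squared loss: each step fits an averaged
    # ensemble of histogram regressors to the current residuals
    # (the negative gradient) and adds a shrunken version of it.
    rng = np.random.default_rng(seed)
    base = float(y.mean())
    pred = np.full(len(y), base)
    steps = []
    for _ in range(n_iter):
        residuals = y - pred
        members = [fit_histogram_regressor(X, residuals, rng=rng)
                   for _ in range(n_ensemble)]
        step = lambda X_new, ms=tuple(members): np.mean([m(X_new) for m in ms], axis=0)
        pred = pred + learning_rate * step(X)
        steps.append(step)
    def predict(X_new):
        return base + learning_rate * sum(f(X_new) for f in steps)
    return predict

A typical call would be predict = gbbhe_sketch(X_train, y_train) followed by y_hat = predict(X_test); a faithful implementation would instead follow the paper's binary histogram partition of the whole feature space and its theoretically motivated choices of bin depth, ensemble size, and number of iterations.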
