SGLB: Stochastic Gradient Langevin Boosting

International Conference on Machine Learning (ICML), 2020
Abstract

In this paper, we introduce Stochastic Gradient Langevin Boosting (SGLB), a powerful and efficient machine learning framework that can handle a wide range of loss functions and has provable generalization guarantees. The method is based on a special form of the Langevin diffusion equation specifically designed for gradient boosting. This allows us to guarantee global convergence even for multimodal loss functions, whereas standard gradient boosting algorithms can guarantee only a local optimum. SGLB is implemented as part of the CatBoost gradient boosting library, and it outperforms classic gradient boosting on classification tasks with the 0-1 loss function, which is known to be multimodal.
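The core Langevin idea the abstract refers to, adding calibrated Gaussian noise to each gradient step so that the iterate can cross barriers between local minima, can be sketched in one dimension as follows. This is a minimal SGLD-style illustration, not the paper's boosting-specific formulation; the loss function, step size, and inverse temperature below are invented for the example:

```python
import math
import random

def sgld_step(theta, grad, lr, beta):
    """One Langevin (SGLD-style) update:
    theta <- theta - lr * grad + sqrt(2 * lr / beta) * N(0, 1),
    where beta is the inverse temperature (larger beta = less noise).
    As beta -> infinity this reduces to plain gradient descent."""
    return theta - lr * grad + math.sqrt(2.0 * lr / beta) * random.gauss(0.0, 1.0)

def loss(theta):
    # Invented multimodal 1-D loss with minima near theta = -1 and theta = +1;
    # the -0.3 * theta tilt makes the minimum near +1 the global one.
    return (theta ** 2 - 1.0) ** 2 - 0.3 * theta

def grad(theta):
    # Exact derivative of the loss above.
    return 4.0 * theta * (theta ** 2 - 1.0) - 0.3

random.seed(0)
theta = -1.0  # start in the shallower (local) minimum
for _ in range(20000):
    theta = sgld_step(theta, grad(theta), lr=1e-3, beta=2.0)
print(round(theta, 2))
```

With a modest inverse temperature, the injected noise lets the iterate occasionally cross the barrier near theta = 0, which is the mechanism behind the global-convergence guarantee; a pure gradient method started at theta = -1 would stay in the local minimum forever.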
