
Universal guarantees for decision tree induction via a higher-order splitting criterion

Abstract

We propose a simple extension of top-down decision tree learning heuristics such as ID3, C4.5, and CART. Our algorithm achieves provable guarantees for all target functions $f: \{-1,1\}^n \to \{-1,1\}$ with respect to the uniform distribution, circumventing impossibility results showing that existing heuristics fare poorly even for simple target functions. The crux of our extension is a new splitting criterion that takes into account the correlations between $f$ and small subsets of its attributes. The splitting criteria of existing heuristics (e.g. Gini impurity and information gain), in contrast, are based solely on the correlations between $f$ and its individual attributes. Our algorithm satisfies the following guarantee: for all target functions $f : \{-1,1\}^n \to \{-1,1\}$, sizes $s \in \mathbb{N}$, and error parameters $\epsilon$, it constructs a decision tree of size $s^{\tilde{O}((\log s)^2/\epsilon^2)}$ that achieves error $\le O(\mathsf{opt}_s) + \epsilon$, where $\mathsf{opt}_s$ denotes the error of the optimal size-$s$ decision tree. A key technical notion that drives our analysis is the noise stability of $f$, a well-studied smoothness measure.
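To illustrate the distinction the abstract draws between first-order and higher-order splitting criteria, the sketch below scores candidate split attributes by the empirical correlations between $f$ and small subsets of attributes (products $\prod_{j \in S} x_j$ for $|S| \le k$), rather than by single-attribute correlations alone. This is a minimal, hypothetical interpretation for exposition, not the paper's exact criterion or guarantee; the function names, the scoring rule, and the parameter `k` are assumptions introduced here.

```python
import itertools
import numpy as np

def subset_correlations(X, y, k):
    """Empirical correlations E_hat[f(x) * prod_{j in S} x_j] for all
    non-empty attribute subsets S of size at most k.
    X: (m, n) array with entries in {-1, +1}; y: (m,) array in {-1, +1}."""
    n = X.shape[1]
    corrs = {}
    for size in range(1, k + 1):
        for S in itertools.combinations(range(n), size):
            chi_S = np.prod(X[:, list(S)], axis=1)  # parity over the attributes in S
            corrs[S] = float(np.mean(y * chi_S))    # empirical correlation with f
    return corrs

def higher_order_split_scores(X, y, k=2):
    """Score each attribute i by the total squared correlation of all
    small subsets containing i. With k = 1 this reduces to the usual
    single-attribute correlation scoring of existing heuristics."""
    n = X.shape[1]
    corrs = subset_correlations(X, y, k)
    scores = np.zeros(n)
    for S, c in corrs.items():
        for i in S:
            scores[i] += c ** 2
    return scores

# Usage sketch: pick the attribute with the largest score as the next split.
rng = np.random.default_rng(0)
X = rng.choice([-1, 1], size=(2000, 6))
y = X[:, 0] * X[:, 1]  # a parity of two attributes: invisible to first-order criteria
print(np.argmax(higher_order_split_scores(X, y, k=1)))  # essentially arbitrary
print(np.argmax(higher_order_split_scores(X, y, k=2)))  # attribute 0 or 1
```

The toy target above (a two-attribute parity) has zero correlation with every individual attribute, which is why purely first-order criteria such as Gini impurity or information gain cannot identify a useful split, while correlations with subsets of size two immediately single out the relevant attributes.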
