Provable guarantees for decision tree induction: the agnostic setting

We give strengthened provable guarantees on the performance of widely employed and empirically successful {\sl top-down decision tree learning heuristics}. While prior works have focused on the realizable setting, we consider the more realistic and challenging {\sl agnostic} setting. We show that for all monotone functions~$f$ and parameters $s \in \mathbb{N}$, these heuristics construct a decision tree of size $s^{\tilde{O}((\log s)/\varepsilon^2)}$ that achieves error $\le \mathsf{opt}_s + \varepsilon$, where $\mathsf{opt}_s$ denotes the error of the optimal size-$s$ decision tree for $f$. Previously, such a guarantee was not known to be achievable by any algorithm, even one that is not based on top-down heuristics. We complement our algorithmic guarantee with a near-matching $s^{\tilde{\Omega}(\log s)}$ lower bound.
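To make the object of study concrete, the following is a minimal sketch of a top-down decision tree heuristic in the CART style: it greedily splits on the single coordinate that most reduces Gini impurity on the training data, subject to a size budget. This is an illustration of the general class of heuristics the abstract refers to, not the paper's exact procedure or analysis; all function names and the budget-splitting policy here are our own choices.

```python
from collections import Counter

def majority(labels):
    # Most common label among the examples reaching this node.
    return Counter(labels).most_common(1)[0][0]

def error(labels):
    # Fraction of examples misclassified by the majority label.
    return 1.0 - Counter(labels).most_common(1)[0][1] / len(labels)

def gini(labels):
    # Gini impurity: 1 - sum of squared class frequencies.
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def build_tree(X, y, size_budget):
    # Greedy top-down heuristic: repeatedly split on the coordinate
    # giving the largest (weighted) Gini impurity reduction, until
    # the leaf budget is exhausted or the node is pure.
    if size_budget <= 1 or error(y) == 0.0:
        return ("leaf", majority(y))
    best = None  # (gain, feature index)
    for i in range(len(X[0])):
        left = [y[j] for j in range(len(X)) if X[j][i] == 0]
        right = [y[j] for j in range(len(X)) if X[j][i] == 1]
        if not left or not right:
            continue  # split does not separate the data
        gain = gini(y) * len(y) - gini(left) * len(left) - gini(right) * len(right)
        if best is None or gain > best[0]:
            best = (gain, i)
    if best is None or best[0] <= 0:
        return ("leaf", majority(y))
    i = best[1]
    Xl = [x for x in X if x[i] == 0]
    yl = [y[j] for j in range(len(X)) if X[j][i] == 0]
    Xr = [x for x in X if x[i] == 1]
    yr = [y[j] for j in range(len(X)) if X[j][i] == 1]
    half = size_budget // 2
    return ("node", i, build_tree(Xl, yl, half), build_tree(Xr, yr, size_budget - half))

def predict(tree, x):
    if tree[0] == "leaf":
        return tree[1]
    _, i, left, right = tree
    return predict(left if x[i] == 0 else right, x)
```

For example, on the monotone function $f(x_1, x_2) = x_1 \vee x_2$ with a budget of four leaves, the heuristic recovers the function exactly; the paper's guarantee bounds how large the budget must be for such heuristics to approach $\mathsf{opt}_s$ on arbitrary monotone targets.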