Optimal Decision Tree Pruning Revisited: Algorithms and Complexity

5 March 2025
Juha Harviainen
Frank Sommer
Manuel Sorge
Stefan Szeider
Abstract

We present a comprehensive classical and parameterized complexity analysis of decision tree pruning operations, extending recent research on the complexity of learning small decision trees. Thereby, we offer new insights into the computational challenges of decision tree simplification, a crucial aspect of developing interpretable and efficient machine learning models. We focus on fundamental pruning operations of subtree replacement and raising, which are used in heuristics. Surprisingly, while optimal pruning can be performed in polynomial time for subtree replacement, the problem is NP-complete for subtree raising. Therefore, we identify parameters and combinations thereof that lead to fixed-parameter tractability or hardness, establishing a precise borderline between these complexity classes. For example, while subtree raising is hard for small domain size $D$ or number $d$ of features, it can be solved in $D^{2d} \cdot |I|^{O(1)}$ time, where $|I|$ is the input size. We complement our theoretical findings with preliminary experimental results, demonstrating the practical implications of our analysis.
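The two pruning operations named in the abstract can be illustrated with a small sketch. The Python code below is a hypothetical illustration, not taken from the paper: it assumes a toy Node representation and shows subtree replacement (collapsing a subtree into a leaf predicting its majority class) and subtree raising (promoting one child subtree in place of its parent; a full implementation would also re-filter the examples from the discarded branch down the raised subtree).

# Hypothetical sketch of the pruning operations discussed in the abstract.
# The Node class and function names are illustrative assumptions, not the paper's code.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    feature: Optional[int] = None       # splitting feature; None for a leaf
    threshold: Optional[float] = None   # split threshold; None for a leaf
    left: Optional["Node"] = None
    right: Optional["Node"] = None
    label: Optional[int] = None         # majority class of training examples reaching this node

def subtree_replacement(node: Node) -> Node:
    """Replace the subtree rooted at `node` by a single leaf
    that predicts the node's majority class."""
    return Node(label=node.label)

def subtree_raising(node: Node, raise_left: bool = True) -> Node:
    """Raise one child of `node` to take its parent's place,
    discarding the other branch. (A complete implementation would
    re-route the discarded branch's examples through the raised subtree.)"""
    return node.left if raise_left else node.right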

@article{harviainen2025_2503.03576,
  title={Optimal Decision Tree Pruning Revisited: Algorithms and Complexity},
  author={Juha Harviainen and Frank Sommer and Manuel Sorge and Stefan Szeider},
  journal={arXiv preprint arXiv:2503.03576},
  year={2025}
}