Fully-Dynamic Decision Trees

1 December 2022
M. Bressan
Gabriel Damay
Mauro Sozio
arXiv:2212.00778
Abstract

We develop the first fully dynamic algorithm that maintains a decision tree over an arbitrary sequence of insertions and deletions of labeled examples. Given $\epsilon > 0$, our algorithm guarantees that, at every point in time, every node of the decision tree uses a split with Gini gain within an additive $\epsilon$ of the optimum. For real-valued features the algorithm has an amortized running time per insertion/deletion of $O\big(\frac{d \log^3 n}{\epsilon^2}\big)$, which improves to $O\big(\frac{d \log^2 n}{\epsilon}\big)$ for binary or categorical features, while it uses space $O(nd)$, where $n$ is the maximum number of examples at any point in time and $d$ is the number of features. Our algorithm is nearly optimal, as we show that any algorithm with similar guarantees uses amortized running time $\Omega(d)$ and space $\tilde{\Omega}(nd)$. We complement our theoretical results with an extensive experimental evaluation on real-world data, showing the effectiveness of our algorithm.
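To make the guarantee concrete, the sketch below illustrates the Gini gain criterion that the abstract's additive-$\epsilon$ bound refers to. This is not the paper's dynamic algorithm; it is a minimal, static illustration, and the function names and the label sets used in the example are illustrative assumptions.

```python
# Minimal sketch (not the paper's algorithm): the Gini gain of a candidate
# split, i.e. the quantity the paper's per-node guarantee is stated in
# terms of. All names and data here are illustrative.
from collections import Counter


def gini(labels):
    """Gini impurity of a multiset of labels: 1 - sum_c p_c^2."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = Counter(labels)
    return 1.0 - sum((c / n) ** 2 for c in counts.values())


def gini_gain(labels, left_labels, right_labels):
    """Reduction in Gini impurity when `labels` is split into the
    two child multisets `left_labels` and `right_labels`."""
    n = len(labels)
    weighted_child_impurity = (
        (len(left_labels) / n) * gini(left_labels)
        + (len(right_labels) / n) * gini(right_labels)
    )
    return gini(labels) - weighted_child_impurity


# Example: a binary feature that separates the two classes perfectly.
labels = [0, 0, 0, 1, 1, 1]
left_labels = [0, 0, 0]    # examples with feature value 0
right_labels = [1, 1, 1]   # examples with feature value 1
print(gini_gain(labels, left_labels, right_labels))  # 0.5, the maximum here
```

The paper's algorithm maintains, under insertions and deletions, a tree in which every node's chosen split has Gini gain within an additive $\epsilon$ of the best gain achievable at that node; the sketch only shows how that gain is computed for a single candidate split.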
