Decision Trees That Remember: Gradient-Based Learning of Recurrent Decision Trees with Memory

Neural architectures such as Recurrent Neural Networks (RNNs), Transformers, and State-Space Models have achieved great success on sequential data by learning temporal dependencies directly. Decision Trees (DTs), by contrast, remain a widely used model class for structured tabular data but are typically not designed to capture sequential patterns. DT-based approaches to time-series data therefore often rely on feature engineering, such as manually adding lag features, which can be suboptimal for capturing complex temporal dependencies. To address this limitation, we introduce ReMeDe Trees, a novel recurrent DT architecture that integrates an internal memory mechanism, akin to the hidden state of an RNN, to learn long-term dependencies in sequential data. Our model learns hard, axis-aligned decision rules for both output generation and state updates, and optimizes them efficiently via gradient descent. We provide a proof-of-concept study on synthetic benchmarks to demonstrate the effectiveness of our approach.
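The abstract does not include an implementation, but the core mechanism it describes (a tree of hard, axis-aligned splits over the concatenation of the current input and a hidden state, with leaves that emit both an output and the next state, trained by gradient descent) can be sketched. Below is a minimal, hypothetical illustration assuming a straight-through estimator to make the hard decisions differentiable; the paper may use a different relaxation, and all identifiers (HardSplit, ReMeDeCell) and hyperparameters are assumptions, not the authors' code.

```python
# Minimal sketch of a ReMeDe-style recurrent decision-tree cell, assuming a
# straight-through estimator for hard splits. Illustrative only; not the
# authors' implementation.
import torch
import torch.nn as nn


class HardSplit(nn.Module):
    """One axis-aligned split: select a feature, compare it to a threshold.

    The forward pass makes a hard 0/1 routing decision; gradients flow
    through soft surrogates (softmax feature selection, sigmoid comparison).
    """

    def __init__(self, num_features: int):
        super().__init__()
        self.feature_logits = nn.Parameter(torch.randn(num_features))
        self.threshold = nn.Parameter(torch.zeros(1))

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        # Hard one-hot feature selection with a soft gradient path.
        soft_w = torch.softmax(self.feature_logits, dim=-1)
        hard_w = torch.zeros_like(soft_w)
        hard_w[soft_w.argmax()] = 1.0
        w = hard_w + soft_w - soft_w.detach()      # straight-through
        soft = torch.sigmoid(z @ w - self.threshold)
        hard = (soft > 0.5).float()
        return hard + soft - soft.detach()         # hard forward, soft backward


class ReMeDeCell(nn.Module):
    """A full binary tree of hard splits routes [x_t, h_{t-1}] to one leaf;
    each leaf stores an output vector and the next hidden state (the memory
    update). Hard routing means exactly one leaf is active per step."""

    def __init__(self, input_dim, hidden_dim, output_dim, depth=3):
        super().__init__()
        self.depth = depth
        self.splits = nn.ModuleList(
            HardSplit(input_dim + hidden_dim) for _ in range(2 ** depth - 1)
        )
        self.leaf_outputs = nn.Parameter(torch.randn(2 ** depth, output_dim))
        self.leaf_states = nn.Parameter(torch.randn(2 ** depth, hidden_dim))

    def forward(self, x_t, h_prev):
        z = torch.cat([x_t, h_prev], dim=-1)                    # (batch, F)
        dec = torch.stack([s(z) for s in self.splits], dim=1)   # (batch, nodes)
        leaf_probs = []
        for leaf in range(2 ** self.depth):
            p, node = torch.ones(z.shape[0]), 0
            for level in range(self.depth):
                go_right = (leaf >> (self.depth - 1 - level)) & 1
                d = dec[:, node]
                p = p * (d if go_right else 1.0 - d)
                node = 2 * node + 1 + go_right                  # heap indexing
            leaf_probs.append(p)
        memb = torch.stack(leaf_probs, dim=1)                   # one-hot rows
        return memb @ self.leaf_outputs, memb @ self.leaf_states
```

Under these assumptions, the cell unrolls over time like any RNN cell, so the state-update rules can be trained with backprop through time:

```python
# Toy usage: unroll over a sequence and train with BPTT (hypothetical setup).
cell = ReMeDeCell(input_dim=4, hidden_dim=8, output_dim=1, depth=3)
opt = torch.optim.Adam(cell.parameters(), lr=1e-2)
x, target = torch.randn(16, 10, 4), torch.randn(16, 1)
h = torch.zeros(16, 8)
for t in range(x.shape[1]):
    y, h = cell(x[:, t], h)
opt.zero_grad()
((y - target) ** 2).mean().backward()
opt.step()
```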
@article{marton2025_2502.04052,
  title   = {Decision Trees That Remember: Gradient-Based Learning of Recurrent Decision Trees with Memory},
  author  = {Sascha Marton and Moritz Schneider},
  journal = {arXiv preprint arXiv:2502.04052},
  year    = {2025}
}