Enhancing Visual Interpretability and Explainability in Functional Survival Trees and Forests

25 April 2025
Giuseppe Loffredo
Elvira Romano
Fabrizio Maturo
Abstract

Functional survival models are key tools for analyzing time-to-event data with complex predictors, such as functional or high-dimensional inputs. Despite their predictive strength, these models often lack interpretability, which limits their value in practical decision-making and risk analysis. This study investigates two key survival models: the Functional Survival Tree (FST) and the Functional Random Survival Forest (FRSF). It introduces novel methods and tools to enhance the interpretability of FST models and improve the explainability of FRSF ensembles. Using both real and simulated datasets, the results demonstrate that the proposed approaches yield efficient, easy-to-understand decision trees that accurately capture the underlying decision-making processes of the model ensemble.
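
The abstract describes survival forests built on functional (curve-valued) predictors. As a rough illustration of the general idea, the sketch below projects discretized curves onto a few principal component scores and fits a random survival forest on those scores. This is a minimal, assumed pipeline using simulated data, scikit-learn's PCA, and scikit-survival's RandomSurvivalForest; it is not the FST/FRSF methodology or the interpretability tools proposed in the paper.

# Illustrative sketch only: RSF on PCA scores of discretized curves,
# a simple surrogate for a functional random survival forest.
import numpy as np
from sklearn.decomposition import PCA
from sksurv.ensemble import RandomSurvivalForest
from sksurv.util import Surv

rng = np.random.default_rng(0)
n, grid_size = 200, 50
t_grid = np.linspace(0, 1, grid_size)

# Simulate functional predictors: smooth random curves observed on a grid.
scores_true = rng.normal(size=(n, 3))
basis = np.vstack([np.sin(np.pi * t_grid),
                   np.cos(np.pi * t_grid),
                   np.sin(2 * np.pi * t_grid)])
X_curves = scores_true @ basis + 0.1 * rng.normal(size=(n, grid_size))

# Simulate right-censored survival outcomes driven by the first score.
event_time = rng.exponential(scale=np.exp(-scores_true[:, 0]))
censor_time = rng.exponential(scale=2.0, size=n)
time = np.minimum(event_time, censor_time)
event = event_time <= censor_time
y = Surv.from_arrays(event=event, time=time)

# Project curves onto principal components (stand-in for FPCA scores).
fpca = PCA(n_components=3)
X_scores = fpca.fit_transform(X_curves)

# Fit a random survival forest on the component scores.
rsf = RandomSurvivalForest(n_estimators=200, min_samples_leaf=10, random_state=0)
rsf.fit(X_scores, y)
print("Training concordance index:", rsf.score(X_scores, y))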

@article{loffredo2025_2504.18498,
  title={Enhancing Visual Interpretability and Explainability in Functional Survival Trees and Forests},
  author={Giuseppe Loffredo and Elvira Romano and Fabrizio Maturo},
  journal={arXiv preprint arXiv:2504.18498},
  year={2025}
}