MDI+: A Flexible Random Forest-Based Feature Importance Framework (arXiv:2307.01932)
4 July 2023
Abhineet Agarwal, Ana M. Kenney, Yan Shuo Tan, Tiffany M. Tang, Bin Yu

Papers citing "MDI+: A Flexible Random Forest-Based Feature Importance Framework"

5 / 5 papers shown
1. TrIM: Transformed Iterative Mondrian Forests for Gradient-based Dimension Reduction and High-Dimensional Regression
   Ricardo Baptista, Eliza O'Reilly, Yangxinyu Xie
   13 Jul 2024

2. Hierarchical Shrinkage: improving the accuracy and interpretability of tree-based methods
   Abhineet Agarwal, Yan Shuo Tan, Omer Ronen, Chandan Singh, Bin Yu
   02 Feb 2022

3. Fast Interpretable Greedy-Tree Sums
   Yan Shuo Tan, Chandan Singh, Keyan Nasseri, Abhineet Agarwal, James Duncan, Omer Ronen, M. Epland, Aaron E. Kornblith, Bin Yu
   28 Jan 2022

4. MDA for random forests: inconsistency, and a practical solution via the Sobol-MDA
   Clément Bénard, Sébastien Da Veiga, Erwan Scornet
   26 Feb 2021

5. Unbiased variable importance for random forests
   Markus Loecher
   04 Mar 2020