ResearchTrend.AI

© 2025 ResearchTrend.AI, All rights reserved.

Rectified Decision Trees: Towards Interpretability, Compression and Empirical Soundness

14 March 2019
Jiawang Bai
Yiming Li
Jiawei Li
Yong Jiang
Shutao Xia

Papers citing "Rectified Decision Trees: Towards Interpretability, Compression and Empirical Soundness"

4 papers shown
  • Knowledge Distillation of Convolutional Neural Networks through Feature Map Transformation using Decision Trees
    Maddimsetti Srinivas, Debdoot Sheet
    10 Mar 2024
  • Using Knowledge Distillation to improve interpretable models in a retail banking context
    Maxime Biehler, Mohamed Guermazi, Célim Starck
    30 Sep 2022
  • Training Interpretable Convolutional Neural Networks by Differentiating Class-specific Filters
    Haoyun Liang, Zhihao Ouyang, Yuyuan Zeng, Hang Su, Zihao He, Shutao Xia, Jun Zhu, Bo Zhang
    16 Jul 2020
  • Born-Again Tree Ensembles
    Thibaut Vidal, Toni Pacheco, Maximilian Schiffer
    24 Mar 2020