ConceptDistil: Model-Agnostic Distillation of Concept Explanations

7 May 2022 · arXiv: 2205.03601
Joao Bento Sousa, Ricardo Moreira, Vladimir Balayan, Pedro Saleiro, P. Bizarro
Topics: FAtt
Links: ArXiv · PDF · HTML

Papers citing "ConceptDistil: Model-Agnostic Distillation of Concept Explanations"

3 / 3 papers shown

InFiConD: Interactive No-code Fine-tuning with Concept-based Knowledge Distillation
Jinbin Huang, Wenbin He, Liang Gou, Liu Ren, Chris Bryan
25 Jun 2024

DiConStruct: Causal Concept-based Explanations through Black-Box Distillation
Ricardo Moreira, Jacopo Bono, Mário Cardoso, Pedro Saleiro, Mário A. T. Figueiredo, P. Bizarro
Topics: CML
16 Jan 2024

Neural Oblivious Decision Ensembles for Deep Learning on Tabular Data
Sergei Popov, S. Morozov, Artem Babenko
Topics: LMTD
13 Sep 2019