ResearchTrend.AI
A Rigorous Information-Theoretic Definition of Redundancy and Relevancy in Feature Selection Based on (Partial) Information Decomposition

10 May 2021
Patricia Wollstadt
Sebastian Schmitt
Michael Wibral
arXiv: 2105.04187

Papers citing "A Rigorous Information-Theoretic Definition of Redundancy and Relevancy in Feature Selection Based on (Partial) Information Decomposition"

5 papers shown
Quantifying Knowledge Distillation Using Partial Information Decomposition
Pasan Dissanayake, Faisal Hamman, Barproda Halder, Ilia Sucholutsky, Qiuyi Zhang, Sanghamitra Dutta
12 Nov 2024
Partial Information Decomposition for Data Interpretability and Feature Selection
Charles Westphal, Stephen Hailes, Mirco Musolesi
29 May 2024
Information-Theoretic State Variable Selection for Reinforcement Learning
Charles Westphal, Stephen Hailes, Mirco Musolesi
21 Jan 2024
Measuring Multi-Source Redundancy in Factor Graphs
Jesse Milzman, Andre V. Harrison, Carlos Nieto-Granda, J. Rogers
13 Mar 2023
Estimating the Unique Information of Continuous Variables
Ari Pakman, Amin Nejatbakhsh, D. Gilboa, Abdullah Makkeh, Luca Mazzucato, Michael Wibral, E. Schneidman
30 Jan 2021