arXiv:2310.04821
A New Baseline Assumption of Integrated Gradients Based on Shapley Value

7 October 2023
Shuyang Liu
Zixuan Chen
Ge Shi
Ji Wang
Changjie Fan
Yu Xiong
Runze Wu
Yujing Hu
Ze Ji
Yang Gao
Abstract

Efforts to decode deep neural networks (DNNs) often involve mapping their predictions back to the input features. Among these methods, Integrated Gradients (IG) has emerged as a significant technique. The selection of appropriate baselines in IG is crucial for crafting meaningful and unbiased explanations of model predictions in diverse settings. The standard approach of utilizing a single baseline, however, is frequently inadequate, prompting the need for multiple baselines. Leveraging the natural link between IG and the Aumann-Shapley Value, we provide a novel outlook on baseline design. Theoretically, we demonstrate that under certain assumptions, a collection of baselines aligns with the coalitions described by the Shapley Value. Building on this insight, we develop a new baseline method called Shapley Integrated Gradients (SIG), which uses proportional sampling to mirror the Shapley Value computation process. Simulations conducted in GridWorld validate that SIG effectively emulates the distribution of Shapley Values. Moreover, empirical tests on various image processing tasks show that SIG surpasses traditional IG baseline methods by offering more precise estimates of feature contributions, providing consistent explanations across different applications, and ensuring adaptability to diverse data types with negligible additional computational demand.
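For orientation, the sketch below shows the standard Integrated Gradients computation the abstract builds on, with attributions naively averaged over several baselines. It is a minimal illustration assuming a toy quadratic model and hand-picked baselines; it is not the paper's SIG method, whose proportional sampling of baselines is described above and detailed in the full text.

```python
# Minimal sketch of Integrated Gradients (IG) with averaging over multiple
# baselines. The toy model, n_steps, and the example baselines are
# illustrative assumptions, not taken from the paper.
import numpy as np

def model(x):
    # Toy scalar "network": f(x) = sum(x_i^2)
    return np.sum(x ** 2)

def model_grad(x):
    # Analytic gradient of the toy model: df/dx_i = 2 * x_i
    return 2.0 * x

def integrated_gradients(x, baseline, grad_fn, n_steps=64):
    # Riemann-sum approximation of
    # IG_i(x) = (x_i - x'_i) * integral_0^1 df(x' + a(x - x'))/dx_i da
    alphas = (np.arange(n_steps) + 0.5) / n_steps          # midpoint rule
    points = baseline + alphas[:, None] * (x - baseline)    # path baseline -> x
    avg_grad = grad_fn(points).mean(axis=0)                 # mean gradient along path
    return (x - baseline) * avg_grad

x = np.array([1.0, -2.0, 0.5])
baselines = [np.zeros_like(x), np.full_like(x, 0.1)]        # multiple baselines
attributions = np.mean(
    [integrated_gradients(x, b, model_grad) for b in baselines], axis=0
)
# Completeness check: attributions sum to f(x) minus the mean baseline score
print(attributions, attributions.sum(), model(x) - np.mean([model(b) for b in baselines]))
```

The averaging step here is deliberately naive; the point of SIG, as stated in the abstract, is to choose and weight baselines so that the resulting attributions track the Shapley Value distribution rather than a uniform average.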
