Data Debugging is NP-hard for Classifiers Trained with SGD

2 August 2024
Zizheng Guo, Pengyu Chen, Yanzhang Fu, Xuelong Li

Papers citing "Data Debugging is NP-hard for Classifiers Trained with SGD"

5 / 5 papers shown

1. If Influence Functions are the Answer, Then What is the Question?
   Juhan Bae, Nathan Ng, Alston Lo, Marzyeh Ghassemi, Roger C. Grosse (12 Sep 2022)

2. Interpretable Data-Based Explanations for Fairness Debugging
   Romila Pradhan, Jiongli Zhu, Boris Glavic, Babak Salimi (17 Dec 2021)

3. Does the Order of Training Samples Matter? Improving Neural Data-to-Text Generation with Curriculum Learning
   Ernie Chang, Hui-Syuan Yeh, Vera Demberg (06 Feb 2021)

4. FastIF: Scalable Influence Functions for Efficient Model Interpretation and Debugging
   Han Guo, Nazneen Rajani, Peter Hase, Joey Tianyi Zhou, Caiming Xiong (31 Dec 2020)

5. Understanding Black-box Predictions via Influence Functions
   Pang Wei Koh, Percy Liang (14 Mar 2017)