Fairness Perceptions in Regression-based Predictive Models

8 May 2025
Mukund Telukunta
Venkata Sriram Siddhardh Nadendla
Morgan Stuart
Casey Canfield
Abstract

Regression-based predictive analytics used in modern kidney transplantation is known to inherit biases from training data, leading to social discrimination and inefficient organ utilization, particularly for certain social groups. Despite this concern, there is limited research on fairness in regression and its impact on organ utilization and placement. This paper introduces three novel divergence-based group fairness notions, (i) independence, (ii) separation, and (iii) sufficiency, to assess the fairness of regression-based analytics tools. In addition, fairness preferences are investigated from crowd feedback in order to identify a socially accepted group fairness criterion for evaluating these tools. A total of 85 participants were recruited from the Prolific crowdsourcing platform, and a Mixed-Logit discrete choice model was used to model fairness feedback and estimate social fairness preferences. The findings show a strong preference for the separation and sufficiency fairness notions, and indicate that the predictive analytics tool is deemed fair with respect to gender and race groups but unfair with respect to age groups.
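To make the divergence-based framing concrete, here is a minimal, hypothetical sketch (not the paper's actual method) of the "independence" notion for a regression model: it compares the binned distribution of predicted scores within each group against the marginal distribution, using KL divergence as the (assumed) divergence measure. The function names, binning strategy, and synthetic data are illustrative assumptions.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) for discrete distributions, smoothed to avoid log(0)."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

def independence_gap(scores, groups, bins=10):
    """Divergence between P(score | group) and the marginal P(score).

    A value near 0 for every group means predictions are approximately
    independent of group membership -- the 'independence' notion above.
    """
    edges = np.histogram_bin_edges(scores, bins=bins)
    marginal, _ = np.histogram(scores, bins=edges)
    gaps = {}
    for g in np.unique(groups):
        cond, _ = np.histogram(scores[groups == g], bins=edges)
        gaps[g] = kl_divergence(cond, marginal)
    return gaps

# Synthetic example: predicted scores and arbitrary group labels.
rng = np.random.default_rng(0)
scores = rng.normal(0.5, 0.1, size=1000)   # e.g. predicted transplant outcome score
groups = rng.choice(["A", "B"], size=1000)
print(independence_gap(scores, groups))
```

Separation and sufficiency would condition these distributions on the true outcome and on the prediction, respectively; this sketch only illustrates the independence case.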

@article{telukunta2025_2505.04886,
  title={Fairness Perceptions in Regression-based Predictive Models},
  author={Mukund Telukunta and Venkata Sriram Siddhardh Nadendla and Morgan Stuart and Casey Canfield},
  journal={arXiv preprint arXiv:2505.04886},
  year={2025}
}