19
0

Sacred or Secular? Religious Bias in AI-Generated Financial Advice

Abstract

This study examines religious biases in AI-generated financial advice, focusing on ChatGPT's responses to financial queries. Using a prompt-based methodology and content analysis, we find that 50% of the financial emails generated by ChatGPT exhibit religious biases, with explicit biases present in both ingroup and outgroup interactions. While ingroup biases personalize responses based on religious alignment, outgroup biases introduce religious framing that may alienate clients or create ideological friction. These findings align with broader research on AI bias and suggest that ChatGPT is not merely reflecting societal biases but actively shaping financial discourse based on perceived religious identity. Using the Critical Algorithm Studies framework, we argue that ChatGPT functions as a mediator of financial narratives, selectively reinforcing religious perspectives. This study underscores the need for greater transparency, bias mitigation strategies, and regulatory oversight to ensure neutrality in AI-driven financial services.

View on arXiv
@article{khan2025_2504.07118,
  title={ Sacred or Secular? Religious Bias in AI-Generated Financial Advice },
  author={ Muhammad Salar Khan and Hamza Umer },
  journal={arXiv preprint arXiv:2504.07118},
  year={ 2025 }
}
Comments on this paper