Written by: Jayati Dubey
April 8, 2025
According to a new study from the Icahn School of Medicine at Mount Sinai, New York, generative artificial intelligence (AI) models may change medical recommendations based on a patient’s demographic or socioeconomic background, even when clinical details remain the same. The findings were published on April 7 in Nature Medicine.
Researchers tested nine large language models using 1,000 emergency department case scenarios, each replicated across 32 distinct patient profiles.
The study generated 1.7 million AI-based medical recommendations in total. Despite identical clinical data, the models sometimes altered their triage decisions, diagnostic orders, or treatment suggestions based solely on nonclinical attributes such as income, race, or sexual orientation.
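For readers who want a concrete picture of the study design, the counterfactual setup it describes can be sketched in a few lines of Python: hold the clinical vignette constant, vary only the nonclinical patient profile, and compare the model's recommendations across versions. This is a minimal, hypothetical illustration, not the researchers' actual code; the query_model stub, the sample vignette, and the profile fields below are all assumptions made for the sketch.

```python
from itertools import product

# Illustrative sketch of the study's counterfactual design (not the authors' code):
# the same clinical vignette is paired with different nonclinical patient
# profiles, and the model's recommendation is compared across profiles.

VIGNETTE = (
    "58-year-old presenting to the ED with chest pain radiating to the left arm, "
    "onset 2 hours ago, diaphoretic, BP 150/95."
)

# Hypothetical profile axes; the actual study used 32 distinct patient profiles.
INCOMES = ["low income", "high income"]
IDENTITIES = ["heterosexual", "LGBTQIA+"]

def query_model(prompt: str) -> str:
    """Placeholder for a real large-language-model call (e.g., an API request).
    Returns a canned triage recommendation so the sketch runs end to end."""
    return "triage: emergent; order ECG and troponin"

def run_counterfactuals() -> None:
    baseline = None
    for income, identity in product(INCOMES, IDENTITIES):
        prompt = f"Patient ({income}, {identity}): {VIGNETTE}\nRecommend triage and workup."
        rec = query_model(prompt)
        if baseline is None:
            baseline = rec
        # The clinical facts never change, so any divergence here is driven
        # by nonclinical attributes alone.
        flag = "" if rec == baseline else "  <-- differs from baseline"
        print(f"{income:12s} | {identity:12s} | {rec}{flag}")

if __name__ == "__main__":
    run_counterfactuals()
```

Scaled up, this is the shape of the experiment: 1,000 vignettes, 32 profiles each, and nine models, with every recommendation category counted toward the 1.7 million total.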
In one notable finding, the models recommended mental health evaluations for patients identified as LGBTQIA+ six to seven times more often than was clinically necessary.
Additionally, higher-income patients were more likely to be advised to undergo advanced imaging tests like CT scans or MRIs, while lower-income patients were often told that no further testing was needed.
“Our research provides a framework for AI assurance, helping developers and healthcare institutions design fair and reliable AI tools,” said Dr. Eyal Klang, chief of generative AI in the Windreich Department of Artificial Intelligence and Human Health at Mount Sinai.
The study underscores the need for rigorous evaluation and mitigation of algorithmic bias to ensure equitable medical care.
Researchers emphasized that the results represent a snapshot of current AI performance. They plan to conduct clinical pilots to assess real-world impact and to explore prompting techniques that reduce bias.
Stay tuned for more such updates on Digital Health News.