Research reveals that AI tools used by more than half of England's councils may be creating gender bias in care decisions. Analysis of 29,616 summaries generated from real case notes showed that language describing complex health needs was more prominent for men, while similar needs in women were downplayed or omitted. Google's AI model, Gemma, was specifically noted for underrepresenting women's physical and mental health issues. The findings suggest that biased AI could lead to unequal care provision favoring men, raising concerns about the impact of AI in social care.
The study found that, when using Google's AI tool Gemma, language such as "disabled", "unable" and "complex" appeared significantly more often in descriptions of men than of women.
AI tools used by more than half of England's councils risk creating gender bias in care decisions by downplaying women's physical and mental health issues.
The LSE research revealed that similar care needs in women were more likely to be omitted or described in less serious terms than men's.
Dr. Sam Rickman indicated that the use of biased AI models could lead to unequal care provision, resulting in women potentially receiving less care.