ChatGPT chooses white man as most likely person in a high-powered job
Briefly

When asked to generate images of the most likely person to hold a high-powered job, ChatGPT chose a man in 99 out of 100 tests. In contrast, when asked to generate images of a secretary, it chose a woman almost every time.
Business leaders warn that AI models, including ChatGPT, can perpetuate societal biases, potentially harming women and minorities in the job market.
Concerns are growing over the widespread use of automated applicant tracking systems built on biased AI, which can undermine inclusivity in hiring.
Read at Mail Online