She didn't get an apartment because of an AI-generated score and sued to help others avoid the same fate
Briefly

The tenant screening tool SafeRent assigned Mary Louis a score that led to her rental application being denied, without explaining how the score was calculated or what it signified, highlighting the lack of transparency in AI-driven decisions.
More than 400 tenants in Massachusetts, most of them Black and Hispanic, were denied housing in similar fashion by algorithmic screening tools, pointing to systemic bias in access to housing.
Read at www.theguardian.com