UK MoJ crime prediction algorithms raise serious concerns | Computer Weekly

The UK's Ministry of Justice (MoJ) is using predictive algorithms to assess individuals' risk of committing crimes, raising serious concerns that reliance on historically biased data will entrench discrimination. Critics, including the pressure group Statewatch, point out that these tools disproportionately target marginalised communities because those communities are overrepresented in police statistics. Flawed algorithms then feed a cycle: they direct more policing at the same communities, which in turn increases those communities' presence in criminal datasets. This feedback loop undermines efforts towards equitable law enforcement and casts doubt on the objectivity claimed for predictive policing systems.
The MoJ's use of data-based profiling tools to predict criminal reoffending risks raises concerns over entrenched discrimination against historically marginalized communities.
The implementation of biased algorithms could create a negative feedback loop of over-policing, further exacerbating existing systemic discrimination in law enforcement.
Authors David Correia and Tyler Wall emphasize that predictive policing gives the false appearance of objectivity, allowing discrimination to persist without overt racial profiling.
The MoJ faces scrutiny as documents reveal its reliance on flawed predictive algorithms, while requests to discuss bias in these policing systems have gone unanswered.