False arrests and wrongful convictions: Why AI gets policing wrong
Briefly

"Within moments police cars arrived, officers drew their weapons and Allen was forced to his knees and handcuffed while they searched him. All they found was a crumpled bag of chips. The AI's misidentification and the human decisions that followed turned a normal evening into a traumatic confrontation."
"Police had arrested her at gunpoint while she was babysitting her four grandchildren. These are unfortunate examples of how AI can lead to mistreatment of people because of technical flaws as well as misplaced human faith in the technology's supposed objectivity."
"AI systems produce probabilities, and people treat them as certainties. We have seen how quickly the shift from probabilistic prediction to operational certainty happens in practice. Once a system signals a possible threat, the question is no longer how certain the prediction is but what to do about it."
"AI policing tools are used in dozens of U.S. cities, although no public registry tracks the full footprint. The tools ingest historical crime data and score neighborhoods on predicted risk so officers can be routed toward the resulting hot spots. The mechanism is straightforward, but its consequence is not."
An AI-enhanced surveillance camera misidentified a Doritos bag as a gun, leading officers to draw their weapons, force a student to his knees, handcuff him, and search him; all they found was a bag of chips. A facial recognition system incorrectly linked a Tennessee grandmother to fraud crimes in North Dakota, resulting in her arrest at gunpoint while she was babysitting and five months in jail before her release. These cases show how technical flaws, combined with misplaced trust in the technology's supposed objectivity, can lead to mistreatment. AI systems generate probabilities, but the people acting on them often treat those probabilities as certainties. Police departments use predictive tools that ingest historical crime data, score neighborhoods on predicted risk, and route officers to the resulting hot spots, turning statistical outputs into operational decisions.
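The probability-to-certainty collapse the article describes can be made concrete with a toy sketch. This is purely illustrative, not any vendor's actual system: all data, names, and the 0.3 threshold below are invented, and real predictive policing tools are far more complex. The point is the last step, where a continuous risk score is flattened into a binary patrol decision.

```python
# Illustrative sketch of a "hot spot" scorer of the kind the article
# describes. All neighborhoods, counts, and thresholds are hypothetical.
from collections import Counter

def score_neighborhoods(incidents):
    """Turn a list of historical incident locations into relative-
    frequency risk scores. These are probabilities, not certainties:
    a high score only means more incidents were recorded there."""
    counts = Counter(incidents)
    total = sum(counts.values())
    return {hood: n / total for hood, n in counts.items()}

def route_officers(scores, threshold=0.3):
    """The lossy step: a continuous score is collapsed into a binary
    patrol / no-patrol decision, and the uncertainty disappears."""
    return [hood for hood, s in scores.items() if s >= threshold]

history = ["Eastside", "Eastside", "Downtown", "Eastside", "Riverview"]
scores = score_neighborhoods(history)
print(scores)                  # Eastside scores 0.6, the others 0.2
print(route_officers(scores))  # only Eastside crosses the threshold
```

Note the feedback risk implicit even in this toy: if officers are routed to Eastside, more incidents will be recorded there, which raises its score in the next round regardless of underlying crime rates.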
Read at Fast Company