The New York Police Department is one of the most heavily equipped police forces on the planet. With over 48,000 full-time staff and a budget nearing $6 billion, the NYPD's access to immense resources puts it closer to the military forces of entire countries like the Philippines and Iraq than to a typical municipal agency. With that kind of coin to play around with, it's probably no surprise that the NYPD would throw a little toward tech. Between 2007 and 2020, the department spent over $2.8 billion amassing a virtual toy chest of surveillance tech, including stingray phone trackers, crime prediction software, and even X-ray vans.
Among the NYPD's most contentious technologies is a facial recognition suite it's been building since 2011 - a system that recently led to the arrest of a man who was a whole head taller than the original suspect. As the New York Times reports, a father named Trevis Williams was wrongly arrested on April 21 after the NYPD's facial recognition software lumped him in with a group of "possible matches" based on grainy CCTV video. Investigators were trying to identify a man who had exposed himself to a woman two months earlier, in February. After they fed the algorithm low-quality footage of the crime, it spat out six faces of similar-looking men - all African American with facial hair and dreadlocks, the NYT reports.
The AI selection on its own was "not probable cause to arrest," and an investigator noted that anyone identified from the NYPD database by the facial recognition system was only a potential suspect. Still, detectives included Williams' image in the photo lineup anyway - a notoriously unreliable process even without the AI dystopia factor. When the victim confidently asserted that Williams was the perpetrator, police had the probable cause they needed to flag him as a suspect.