
"On Thursday, officials in the UK pledged to roll out a country-wide facial recognition system to help police track down criminals. The country's ministers have launched a 10-week consultation to analyze the regulatory and privacy framework of their AI-powered surveillance panopticon - but one way or another, the all-seeing eye is on its way. There's just one tiny wrinkle: the AI facial recognition cameras have a tendency to misidentify non-white people."
"According to the NPL analysis, the national 'retrospective facial recognition tool' - one of three types of facial recognition software used by national police - has a 'false positive identification rate for white subjects (0.04 percent),' which is 'lower than that for Asian subjects (4.0 percent) and black subjects (5.5 percent).' 'This has meant that in some circumstances it is more likely to incorrectly match Black and Asian people than their white counterparts.'"
UK officials pledged a country-wide AI facial recognition system to help police track criminals, while launching a 10-week consultation on regulatory and privacy frameworks. Testing by the National Physical Laboratory found the technology more likely to incorrectly include Black and Asian people in search results. The NPL reported false positive identification rates for the national retrospective facial recognition tool of 0.04% for white subjects, 4.0% for Asian subjects, and 5.5% for Black subjects. National police minister Sarah Jones hailed the tech as a major breakthrough, while lower-ranking police commissioners and the Association of Police and Crime Commissioners warned of inbuilt bias and inadequate operational safeguards. London is already highly surveilled.
Read at Futurism