Home Office admits facial recognition tech issue with black and Asian subjects

"Ministers are facing calls for stronger safeguards on facial recognition technology after the Home Office admitted that it is more likely to incorrectly identify black and Asian people than their white counterparts on some settings. Following the latest testing conducted by the National Physical Laboratory (NPL) of the technology's application within the police national database, the Home Office said it was more likely to incorrectly include some demographic groups in its search results."
"Analysts who examined the police national database's retrospective facial recognition technology tool at a lower setting found that the false positive identification rate (FPIR) for white subjects (0.04 %) is lower than that for Asian subjects (4.0 %) and black subjects (5.5 %). The testing went on to find that the number of false positives for black women was particularly high. The FPIR for black male subjects (0.4 %) is lower than that for black female subjects (9.9 %), the report said."
"Facial recognition technology scans people's faces and then cross-references the images against watchlists of known or wanted criminals. It can be used while examining live video footage of people passing cameras and comparing their faces with those on wanted lists, or be used by officers to target individuals as they walk by mounted cameras. Images of suspects can also be run retrospectively through police, passport or immigration databases to identify them and check their backgrounds."
Testing by the National Physical Laboratory of the police national database's retrospective facial recognition tool found higher false-positive identification rates for Asian and Black people than for white people. At a lower sensitivity setting, FPIR was 0.04% for white subjects, 4.0% for Asian subjects and 5.5% for Black subjects. False positives for Black women were particularly high, with FPIR for Black female subjects at 9.9% versus 0.4% for Black male subjects. Ministers face calls for stronger safeguards and police leaders urge caution over national expansion. The technology can be used on live video or retrospective database searches and raises concerns about demographic bias.
Read at www.theguardian.com