'Urgent clarity' sought over racial bias in UK police facial recognition technology
Briefly

"In a statement responding to the report, Emily Keaney, the deputy commissioner for the Information Commissioner's Office, said the ICO had asked the Home Office for urgent clarity on this matter in order for the watchdog to assess the situation and consider our next steps. The next steps could include enforcement action, including issuing a legally binding order to stop using the technology or fines, as well as working with the Home Office and police to make improvements."
"Last week we were made aware of historical bias in the algorithm used by forces across the UK for retrospective facial recognition within the police national database. We acknowledge that measures are being taken to address this bias. However, it's disappointing that we had not previously been told about this, despite regular engagement with the Home Office and police bodies as part of our wider work to hold government and the public sector to account on how data is being used in their services."
Testing by the National Physical Laboratory found that the facial recognition technology used with the police national database is more likely to incorrectly match Black and Asian people than their white counterparts. The Home Office acknowledged higher incorrect inclusion rates for some demographic groups. The Information Commissioner's Office has asked the Home Office for urgent clarity so the watchdog can assess the situation and consider next steps, which could include enforcement orders, fines, or working with police to make improvements. The ICO said measures are being taken to address the historical bias but expressed disappointment at not being informed earlier. Police and crime commissioners urged caution over plans for a national system.
Read at www.theguardian.com