
"Emily Keaney, deputy commissioner for the Information Commissioner's Office (ICO), said the regulator only learned last week about historical bias in the algorithm used by UK police forces for retrospective facial recognition (RFR) within the Police National Database (PND). "It's disappointing that we had not previously been told about this, despite regular engagement with the Home Office and police bodies as part of our wider work to hold government and the public sector to account on how data is being used in their services."
""While we appreciate the valuable role technology can play, public confidence in its use is paramount, and any perception of bias and discrimination can exacerbate mistrust. The ICO is here to support and assist the public sector to get this right." The ICO has requested urgent clarity from the Home Office to assess the situation and determine next steps. Keaney's comments follow updated accuracy tests published on December 4, conducted by the National Physical Laboratory and commissioned by the Home Office."
The National Physical Laboratory's independent accuracy tests compared Cognitec FaceVACS-DBScan ID v5.5, the algorithm currently in use, with Idemia MBSS FR, which is planned to replace it. Idemia performed nearly perfectly in both ideal and realistic scenarios, while Cognitec showed significant weaknesses for certain demographics when strict similarity thresholds were applied to eliminate false positives.
Read at The Register