The Information Commissioner's Office completed a first-ever data protection audit of UK police use of facial recognition technology, examining South Wales Police and Gwent Police. The audit assessed necessity and proportionality, design fairness and accuracy, and end-to-end compliance with UK data protection law. Findings provided high assurance that processes and procedures at both forces comply with data protection requirements. Trained staff provide human oversight to mitigate discrimination and prevent solely automated decisions. A formal application process assesses necessity and proportionality before each live facial recognition deployment. Both forces have mapped data flows, can demonstrate lawful provenance of images for biometric templates, and maintain appropriate DPIAs.
The Information Commissioner's Office (ICO) has completed its first-ever data protection audit of UK police forces deploying facial recognition technologies (FRT), noting it is "encouraged" by its findings. The ICO's audit, which investigated how South Wales Police and Gwent Police are using and protecting people's personal information when deploying facial recognition, marks the first time the data regulator has formally audited a UK police force for its use of the technology.
According to an executive summary published on 20 August, the scope of the facial recognition audit - which was agreed with the two police forces beforehand - focused on questions of necessity and proportionality (a key legal test for the deployment of new technologies), whether the technology's design meets expectations around fairness and accuracy, and whether "the end-to-end process" is compliant with the UK's data protection rules.
"The forces made sure there was human oversight from trained staff to mitigate the risk of discrimination and ensure no decisions are solely automated, and a formal application process to assess the necessity and proportionality before each LFR deployment," she wrote. The executive summary added that South Wales Police and Gwent Police have "comprehensively mapped" their data flows, can "demonstrate the lawful provenance" of the images used to generate biometric templates, and have appropriate data protection impact assessments (DPIAs) in place.