
"According to a comparative study of LFR trials by law enforcement agencies in London, Wales, Berlin and Nice, although "in-the-wild" testing is an important opportunity to collect information about how artificial intelligence (AI)-based systems like LFR perform in real-world deployment environments, the trails conducted so far have failed to take into account the socio-technical impacts of the systems in use, or to generate clear evidence of the operational benefits."
"Without this, the authors said "we worry that such tests will be little more than 'show trials' - public performances used to legitimise the use of powerful and invasive digital technologies in support of controversial political agendas for which public debate and deliberation is lacking, while deepening governmental reliance on commercially developed technologies which fall far short of the legal and constitutional standards which public authorities are required to uphold"."
Real-world testing of live facial recognition (LFR) systems by UK and European police has largely been ungoverned, with technology tested on local populations without adequate safeguards or oversight. In-the-wild trials can provide information about AI-based system performance in deployment environments but have so far failed to consider socio-technical impacts or to produce clear evidence of operational benefit. Clear guidance and governance frameworks are needed to ensure trials are epistemically, legally and ethically responsible. Without such safeguards, tests risk legitimising intrusive technologies, deepening reliance on commercial systems, and undermining privacy and civil liberties.
#live-facial-recognition #law-enforcement-surveillance #governance-and-oversight #privacy-and-civil-liberties
Read at ComputerWeekly.com