
"First, it seems highly probable the algorithms required to police people's devices will deliver false positives, likely including fine art portraits. It's important to note that instances of this have already happened in response to the UK's poorly-crafted and badly implemented Online Safety Act (OSA): one social media post of a painting by Francisco de Goya was restricted for UK users, reports the BBC."
"Second, in the event the OS does detect a false positive, what happens next? Does the law imply perfectly innocent culture vultures will end up having to explain themselves to the authorities for daring to look at art? The third and biggest negative consequence is the same as it has always been: once you have smartphone operating systems working to analyze the content on your devices for one thing, what is to stop those systems working to identify other forms of content on the device?"
UK policy to have smartphone operating systems analyze device content for illegal material risks widespread unintended consequences. Automated detection algorithms are highly likely to produce false positives, sometimes blocking benign items such as fine-art portraits; a social media post of a Francisco de Goya painting has already been restricted for UK users under the Online Safety Act, the BBC reports. False detections could leave perfectly innocent users having to explain lawful behavior to the authorities. And once an operating system can inspect personal devices for one category of content, the same machinery creates a pathway to detecting other content types; authoritarian governments could compel exactly that kind of surveillance, prompting population-wide shifts toward VPNs and other circumvention tools.
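The scale of the false-positive problem follows from simple base-rate arithmetic rather than from any particular scanner. The sketch below is illustrative only: the scan volume, prevalence, and accuracy figures are assumptions, not numbers from the article or from any real system.

```python
# Illustrative base-rate arithmetic for on-device content scanning.
# All numbers below are hypothetical assumptions, not figures from the
# article or from any deployed system.

scans_per_day = 1_000_000_000   # assumed items scanned across all UK devices per day
prevalence = 1e-6               # assumed fraction of scanned items that are actually illegal
false_positive_rate = 0.001     # assumed 0.1% of benign items wrongly flagged (e.g. a Goya portrait)
true_positive_rate = 0.99       # assumed detection rate for genuinely illegal items

illegal_items = scans_per_day * prevalence
benign_items = scans_per_day - illegal_items

true_flags = illegal_items * true_positive_rate
false_flags = benign_items * false_positive_rate

print(f"Correct flags per day: {true_flags:,.0f}")
print(f"False flags per day:   {false_flags:,.0f}")
# Because benign content vastly outnumbers illegal content, even a 0.1%
# error rate produces roughly 1,000,000 false flags for every ~1,000
# correct ones under these assumptions.
```

The point of the exercise is that false flags dominate whenever the scanned material is overwhelmingly lawful, which is exactly the situation on ordinary people's phones.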
Read at Computerworld