
"But the disturbing implications of this technology are clear as soon as you consider that AI can be used just as easily to make deepfakes that incriminate the poor, the marginalized, and the already over-policed-folks for whom guilt is the default conclusion with the flimsiest evidence. What happens when racist police, convinced as they so often are of a suspect's wrongdoing based solely on their evidence of their Blackness, are presented with AI-generated video "proof"?"
"What about when law enforcement officials, who are already legally permitted to use faked incriminating evidence to dupe suspects into confessing-real-life examples have included forged DNA lab reports, phony polygraph test results, and falsified fingerprint " matches"-start regularly using AI to manufacture "incontrovertible evidence" for the same? How long until, as legal scholars Hillary B. Farber and Anoo D. Vyasin suggest,"
An AI video app called Sora produced a hyperrealistic deepfake of Sam Altman shoplifting, demonstrating how realistic user-generated synthetic video has become. The technology lets anyone make, post, and remix phony videos; satire aimed at powerful figures may be relatively harmless, but the same tools become dangerous when turned on marginalized people. Law enforcement already uses fabricated evidence in interrogations, including forged lab reports, phony polygraphs, and falsified fingerprint matches. Widespread AI deepfakes could let police present manufactured video "proof" or fake witnesses and accomplices to coerce confessions. Legal scholars and professors warn that such tools could exacerbate racial bias and further criminalize the over-policed.
Read at The Nation