
"We do not fear advances in technology - but we do have legitimate concerns about some of the products on the market now... AI continues to develop and we are hopeful that we will reach a point in the near future where these reports can be relied on. For now, our office has made the decision not to accept any police narratives that were produced with the assistance of AI."
"In July of this year, EFF published a report on how Axon designed Draft One to defy transparency. Police upload their body-worn camera's audio into the system, the system generates a report that the officer is expected to edit, and then the officer exports the report. But when they do that, Draft One erases the initial draft, and with it any evidence of what portions of the report were written by AI and what portions were written by an officer."
Generative AI tools for police reporting have proliferated rapidly. Axon's Draft One has become the most popular, and Axon is already the largest U.S. provider of body-worn cameras. Draft One ingests body-camera audio, generates a draft report for the officer to edit, and erases the initial draft upon export, destroying evidence of which text was AI-generated and which was officer-written. This erasure prevents courts and oversight bodies from verifying original narratives and assessing officer credibility. Some prosecuting offices, such as King County's, have barred AI-assisted narratives because current AI products are unreliable and opaque. Bundling surveillance hardware with AI tools lowers procurement friction and heightens public concern.
Read at Electronic Frontier Foundation