What we learned from a failed Nota News experiment - Poynter
"The use of limited sources was not consistently followed by the journalist contractors we hired. Stories were published that drew from local news reporting without attribution, in some cases lifting content directly."
"We want to be clear: this was human error. It was not caused by AI and no AI was used in the creation of plagiarized content."
"Accountability is about the systems we built, the oversight we failed to enforce, and the standards we didn't make strong enough in practice."
Nota launched 11 hyperlocal news sites that produced local stories using AI-assisted tools and a limited set of approved public sources. However, some journalist contractors violated those guidelines by copying content from other news outlets without attribution, an issue first identified by Axios and documented by Poynter. The problem stemmed from human error, not AI, since the workflow was designed to draw only on the approved sources. Accountability, Nota says, lies with the systems and oversight that were insufficiently enforced, not with the use of AI tools in journalism.