The best guide to spotting AI writing comes from Wikipedia | TechCrunch
Briefly

"To start with, the guide confirms what we already know: automated tools are basically useless. Instead, the guide focuses on habits and turns of phrase that are rare on Wikipedia but common on the internet at large (and thus, common in the model's training data). According to the guide, AI submissions will spend a lot of time emphasizing why a subject is important, usually in generic terms like "a pivotal moment" or "a broader movement.""
"With millions of edits coming in each day, there's plenty of material to draw on, and in classic Wikipedia-editor style, the group has produced a field guide that's both detailed and heavy on evidence. The guide flags a particularly interesting quirk around tailing clauses with hazy claims of importance. Models will say some event or detail is "emphasizing the significance" of something or other, or "reflecting the continued relevance" of some general idea"
In 2023, Wikipedia editors launched Project AI Cleanup to flag AI-generated submissions across millions of daily edits. Automated detection tools perform poorly; human-observed stylistic habits prove more revealing. AI-generated prose often emphasizes why a subject matters in generic phrases like "a pivotal moment" or "a broader movement." It also inflates notability by citing minor media appearances and attaches trailing clauses that lodge hazy claims of significance, such as "emphasizing the significance" or "reflecting the continued relevance." The large volume of edits gives editors abundant evidence for identifying these recurring patterns.
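The phrasing tells described above lend themselves to a very simple keyword check. As a rough illustration only (this is not Wikipedia's guide, TechCrunch's article, or any real detection tool; the phrase list, scoring, and function names are hypothetical), a naive matcher for "hazy significance" wording might look like this:

```python
import re

# Hypothetical phrase list inspired by the examples quoted above; the
# guide itself describes habits, not a fixed blocklist.
SIGNIFICANCE_PHRASES = [
    r"a pivotal moment",
    r"a broader movement",
    r"emphasizing the significance",
    r"reflecting the continued relevance",
]

# Case-insensitive alternation over the phrase list.
PATTERN = re.compile("|".join(SIGNIFICANCE_PHRASES), re.IGNORECASE)

def flag_hazy_significance(text: str) -> list[str]:
    """Return any generic 'importance' phrases found in the text."""
    return PATTERN.findall(text)

if __name__ == "__main__":
    sample = ("The event marked a pivotal moment, "
              "reflecting the continued relevance of the movement.")
    print(flag_hazy_significance(sample))
    # -> ['a pivotal moment', 'reflecting the continued relevance']
```

A keyword pass like this is exactly the kind of automation the guide says falls short on its own; the editors' point is that these phrases are signals for human reviewers, not a reliable classifier.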
Read at TechCrunch