
"This approach has borne fruit in the past, but not every SEO suggestion is a hit. The tumultuous current state of the Internet, defined by inconsistent traffic and rapidly expanding use of AI, may entice struggling publishers to try more SEO snake oil like content chunking. When traffic is scarce, people will watch for any uptick and attribute that to the changes they have made. When the opposite happens, well, it's just a bad day."
"You've made all these things that you did specifically for a ranking system, not for a human being because you were trying to be more successful in the ranking system, not staying focused on the human being. And then the systems improve, probably the way the systems always try to improve, to reward content written for humans. All that stuff that you did to please this LLM system that may or may not have worked, may not carry through for the long term."
Google provides only general SEO recommendations, forcing practitioners to infer algorithm behavior from outcomes. The current Internet climate, marked by inconsistent traffic and rapidly expanding AI use, can tempt publishers to test speculative tactics such as content chunking. Short-term traffic gains tend to be attributed to those changes, while drops are written off as normal variance. Content chunking can appear effective because of existing Google quirks, not because LLMs prefer split content: Google does not design its LLMs to favor fragmented pages, and any apparent benefit is likely limited to isolated edge cases. As ranking systems evolve to reward content written for humans, strategies optimized for ranking signals rather than readers risk losing effectiveness. Google considers chopping up content for LLMs an unviable long-term SEO strategy, though publishers may keep doing it as long as they perceive a benefit.
Read at Ars Technica