
"The Atlantic has built a scorecard for AI crawlers, identifying which bots actually send readers back and which just strip content. Only those with value get through. This approach led it to block a single AI crawler that tried to recrawl its site 564,000 times in the past seven days. Some publishers have taken a hard line on AI crawlers, blocking all that they don't have a licensing deal with."
"The Atlantic kicked off this AI bot-blocking rating system this summer, when Thompson and chief product officer Gitesh Gohel started tracking how much their site was being scraped by AI crawlers without their permission. They used Cloudflare's tool, which had launched three weeks earlier, giving all its publisher customers the ability to block AI crawlers by default. They charted in a spreadsheet which crawlers were hitting its site, and which led to referral traffic and subscription conversions."
The Atlantic developed a scorecard to evaluate AI crawlers by whether they send readers back or merely strip content, and used it to block one crawler that recrawled the site 564,000 times in seven days. The publication maintains a licensing deal with OpenAI but requires other crawlers to demonstrate referral traffic or subscription conversions before unblocking them. Using Cloudflare's tool to block AI crawlers by default, the team tracked crawler activity and conversions in a spreadsheet. The goal is to force AI engines to pay licensing fees if they want access to content to improve LLM outputs; executives report that most AI platforms currently drive negligible traffic.
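The scorecard logic described above — weigh each bot's crawl volume against the referral traffic and subscription conversions it sends back, then block the takers — can be sketched in a few lines. This is a hypothetical illustration only: the field names, thresholds, and `should_block` function are assumptions, not The Atlantic's actual criteria or tooling.

```python
# Illustrative sketch of a crawler scorecard: block a bot unless it is
# licensed or sends meaningful value (referrals/conversions) back.
# All names and thresholds are assumed for illustration.

from dataclasses import dataclass

@dataclass
class CrawlerStats:
    name: str
    crawls_7d: int        # requests made in the past seven days
    referrals_7d: int     # visits the bot referred back to the site
    conversions_7d: int   # subscriptions attributed to those visits

def should_block(stats: CrawlerStats,
                 min_referral_ratio: float = 0.001,
                 licensed: bool = False) -> bool:
    """Block unless the bot is licensed or returns measurable value."""
    if licensed:
        return False          # licensing deal in place, e.g. OpenAI
    if stats.crawls_7d == 0:
        return False          # nothing to score yet
    referral_ratio = stats.referrals_7d / stats.crawls_7d
    return referral_ratio < min_referral_ratio and stats.conversions_7d == 0

# A bot that recrawls heavily while sending almost nothing back gets blocked:
greedy = CrawlerStats("examplebot", crawls_7d=564_000,
                      referrals_7d=12, conversions_7d=0)
print(should_block(greedy))  # True
```

In practice the decision would be fed by server logs and analytics attribution rather than hand-entered stats, but the shape of the spreadsheet exercise the article describes is the same: crawls in, value out, block the imbalance.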
Read at Digiday