When bots look like buyers: agentic traffic causing new publisher headaches
Briefly

"The term agentic visitors describes when AI assistants or automated agents - like OpenAI's ChatGPT and now its browser Atlas, Perplexity or Google AI Overviews - visit and read a publisher's content on behalf of a human user. Often, without generating a traditional pageview or ad impression so publishers can't make money on the visit. TollBit, which helps publishers charge AI bots for content access,already mentioned in its latest Q2 report (published just before Atlas hit the market) that the next wave of AI visitors increasingly look like humans on sites."
"They pose a sticky problem for publishers and the third-party vendors reporting them. They're not bots, but they're not people either. They're the crawlers that hit a publisher's site for information to answer a question or prompt typed into an AI engine by individuals who are increasingly using them as their de facto search engines. So technically, they're a bot seeking information that a human has requested."
""Agents and bots mimicking humans is an anti-pattern that has dangerous implications, such as eroding advertiser trust," said Olivia Joslin, co-founder and COO of TollBit."
Agentic visitors are AI assistants or automated agents that fetch and read publisher content on behalf of human users. They frequently visit without generating traditional pageviews or ad impressions, preventing publishers from monetizing those visits. Recent AI browsers and agents increasingly appear indistinguishable from human traffic in site logs, creating measurement ambiguity. Publishers and third-party vendors struggle to track and value these visits, and advertisers have begun pausing spend when agentic traffic spikes. Companies like TollBit offer tools to charge AI bots, but the broader industry still lacks reliable methods to separate agentic visitors from real humans.
Read at Digiday