Google Comments On Serving Markdown Pages To LLM Crawlers
Briefly

"Are you sure they can even recognize MD on a website as anything other than a text file? Can they parse & follow the links? What will happen to your site's internal linking, header, footer, sidebar, navigation? It's one thing to give it a MD file manually, it seems very different to serve it a text file when they're looking for a HTML page."
"This morning I made a small change to my site: I made every page available as Markdown for AI agents and crawlers. I expected maybe a trickle. Within an hour, I was seeing hundreds of requests from ClaudeBot, GPTBot, and OpenAI's SearchBot. 😲 https://t.co/UD0h22AZEC"
— Dries Buytaert (@Dries), January 14, 2026
Serving raw Markdown to LLM crawlers can create recognition and parsing problems, because crawlers may treat Markdown as a plain text file rather than a web page. They may not parse or follow links in Markdown, which risks breaking internal linking, headers, footers, sidebars, and navigation. Handing an agent a Markdown file manually is different from serving Markdown as the default response when a crawler expects HTML, and converting pages solely to Markdown could harm site structure and discoverability. At the same time, practical experiments show that making pages available as Markdown can trigger rapid bot activity from ClaudeBot, GPTBot, and OpenAI's SearchBot.
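One way to offer Markdown without replacing the HTML pages crawlers expect is content negotiation: serve Markdown only when the client asks for it, or (more aggressively) when the User-Agent matches a known AI crawler. The sketch below is a minimal illustration of that decision logic, not anything described in the article; the function name and the bot token list (including "OAI-SearchBot" for OpenAI's SearchBot) are assumptions.

```python
# Hypothetical sketch of per-request representation selection.
# Bot token strings are assumptions based on the crawlers named above.
AI_CRAWLER_TOKENS = ("ClaudeBot", "GPTBot", "OAI-SearchBot")

def choose_representation(user_agent: str, accept: str = "text/html") -> str:
    """Decide whether to serve 'markdown' or 'html' for a request.

    Safest policy first: serve Markdown only when the client explicitly
    asks for it via the Accept header, so ordinary crawlers expecting
    HTML are never surprised.
    """
    if "text/markdown" in accept:
        return "markdown"
    # Riskier fallback policy: serve Markdown to known AI crawlers by
    # User-Agent sniffing. This is exactly the case the quoted comment
    # warns about (bots may treat it as a bare text file).
    if any(token in user_agent for token in AI_CRAWLER_TOKENS):
        return "markdown"
    return "html"
```

A gentler variant keeps HTML as the only default and merely advertises the Markdown version, e.g. with `<link rel="alternate" type="text/markdown" href="...">` in the HTML head, letting agents opt in.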
Read at Search Engine Roundtable