Google On When Robots.txt Is Unreachable: Reachability Of Other Pages Matters
Briefly

Gary Illyes from Google emphasized the importance of keeping key pages, such as the homepage, reachable when a site's robots.txt file returns a 503 error for an extended period.
Carlos Sánchez Donate initiated a significant discussion on LinkedIn about the implications of a persistent robots.txt 503 error and its impact on crawlers.
Illyes noted that if crawlers can still access important pages while robots.txt is erroring, the site sits in a limbo state but may not be treated as harshly.
The conversation prompted questions about how clearly Google's documentation explains robots.txt 5xx error handling, highlighting ongoing concerns among SEOs.
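To make the 5xx behavior under discussion concrete, here is a minimal sketch of how a crawler might classify a robots.txt fetch result. The function name, return labels, and the 30-day threshold are illustrative assumptions loosely based on Google's published guidance (5xx initially treated as fully disallowed, with a possible fallback to a cached copy after prolonged unreachability), not an exact reproduction of Googlebot's logic.

```python
# Hypothetical sketch; labels and the 30-day cutoff are assumptions
# based on Google's documented robots.txt handling, not actual Googlebot code.

def classify_robots_fetch(status: int, unreachable_days: int = 0) -> str:
    """Return a coarse crawl decision for a robots.txt HTTP status."""
    if 200 <= status < 300:
        # Successful fetch: parse the file and obey its rules.
        return "parse-and-obey"
    if 400 <= status < 500:
        # 4xx (including 404/410): treated as if no robots.txt exists,
        # so crawling is allowed.
        return "crawl-all"
    if 500 <= status < 600:
        # 5xx such as 503: initially treat the whole site as disallowed;
        # after prolonged unreachability, fall back to a cached copy
        # (or, lacking one, behave as if robots.txt doesn't exist).
        if unreachable_days < 30:
            return "assume-disallow"
        return "fallback-cached-or-crawl"
    # Conservative default for unexpected statuses.
    return "assume-disallow"
```

This illustrates why a long-lived 503 on robots.txt is risky: the site can effectively be treated as fully disallowed while the error persists, which is the "limbo state" Illyes describes.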
Read at Search Engine Roundtable