Google Clarified Support For Robots.txt Fields
Briefly

"Additionally, Google made this modification in response to frequent inquiries about unsupported fields, providing clarity for webmasters to avoid confusion and ensure proper site crawling."
"As Jaskaran Singh noted, using unsupported fields like crawl-delay leads to Google ignoring them altogether, emphasizing the importance of sticking to supported directives for optimal crawling."
"Pro Tip: Always use comments (#) for better readability in your robots.txt file and ensure that all paths begin with / to avoid any crawling errors."
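A minimal robots.txt sketch illustrating the tips above: Google's supported fields (user-agent, allow, disallow, sitemap), comments introduced with #, and paths that begin with /. The domain and paths here are placeholders, not from the article.

```
# Rules for all crawlers (comments improve readability)
User-agent: *
Disallow: /private/
Allow: /public/

# Crawl-delay is NOT supported by Google and will be ignored
# Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
```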
Read at Search Engine Roundtable