In a recent update to its Search Central documentation, Google has clarified its position on unsupported fields in robots.txt files.
The company states that its crawlers don’t support any fields that aren’t listed in its robots.txt documentation.
This clarification is part of Google’s efforts to provide unambiguous guidance to website owners and developers.
Google states:
“We sometimes get questions about fields that aren’t explicitly listed as supported, and we want to make it clear that they aren’t.”
This update should eliminate confusion and prevent websites from relying on unsupported directives.
According to the updated documentation, Google officially supports the following fields in robots.txt files: user-agent, allow, disallow, and sitemap.
While not explicitly stated, this clarification implies that Google doesn’t support commonly used directives like “crawl-delay,” although other search engines may recognize them.
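To see how this plays out in practice, here is a minimal sketch using Python’s standard-library urllib.robotparser (not a tool Google uses; the example.com URLs and rules are hypothetical). It parses a robots.txt file that mixes the supported fields with a crawl-delay line: the allow and disallow rules resolve as expected for any crawler, and while this parser (and some other search engines) will read the crawl-delay value, Google’s crawlers ignore it.

```python
from urllib import robotparser

# Hypothetical robots.txt mixing fields Google supports
# (user-agent, allow, disallow, sitemap) with crawl-delay,
# which Google's crawlers do not act on.
ROBOTS_TXT = """\
User-agent: *
Allow: /private/public-page.html
Disallow: /private/
Crawl-delay: 10
Sitemap: https://www.example.com/sitemap.xml
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Allow/disallow rules work the same for any crawler that honors them.
print(parser.can_fetch("Googlebot", "https://www.example.com/private/"))                  # False
print(parser.can_fetch("Googlebot", "https://www.example.com/private/public-page.html"))  # True

# Listed sitemaps are exposed by the parser (Python 3.8+).
print(parser.site_maps())  # ['https://www.example.com/sitemap.xml']

# The crawl-delay value is parsed here, and some other engines honor it,
# but Google ignores it.
print(parser.crawl_delay("Googlebot"))  # 10
```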
Additionally, it’s worth noting that Google is phasing out support for the ‘noarchive’ directive.
This update is a reminder to stay current with official guidelines and best practices.
It highlights the need to use documented features rather than assuming support for undocumented directives.
Consult Google’s official Search Central documentation for more detailed information on robots.txt implementation and best practices.