
You Don’t Need Robots.txt On Root Domain, Says Google

Google’s Gary Illyes shares an unconventional but valid method for centralizing robots.txt rules on CDNs.

Robots.txt files can be centralized on CDNs, not just root domains.
Websites can redirect robots.txt from main domain to CDN.
This unorthodox approach complies with RFC 9309, the updated Robots Exclusion Protocol standard.

In a recent LinkedIn post, Google Analyst Gary Illyes challenged a long-standing belief about the placement of robots.txt files.


For years, the conventional wisdom has been that a website’s robots.txt file must reside at the root domain (e.g., example.com/robots.txt).

However, Illyes has clarified that this isn’t an absolute requirement and revealed a lesser-known aspect of the Robots Exclusion Protocol (REP).

Robots.txt File Flexibility
The robots.txt file doesn’t have to be located at the root domain (example.com/robots.txt).

According to Illyes, having two separate robots.txt files hosted on different domains is permissible—one on the primary website and another on a content delivery network (CDN).

Illyes explains that websites can centralize their robots.txt file on the CDN while controlling crawling for their main site.

For instance, a website could have two robots.txt files: one at https://cdn.example.com/robots.txt and another at https://www.example.com/robots.txt.

This approach allows a site to maintain a single, comprehensive robots.txt file on its CDN and redirect requests from the main domain to this centralized file.
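As an illustration, that redirect could be implemented at the web-server level. The snippet below is a hypothetical sketch using nginx (the hostnames come from the example above); any server- or CDN-level redirect would work the same way:

```nginx
server {
    server_name www.example.com;

    # Permanently redirect the main domain's robots.txt
    # to the centralized copy hosted on the CDN.
    location = /robots.txt {
        return 301 https://cdn.example.com/robots.txt;
    }
}
```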

Illyes notes that crawlers complying with RFC 9309 will follow the redirect and use the target file as the robots.txt file for the original domain.
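To make that redirect-following behavior concrete, here is a minimal Python sketch of an RFC 9309-style resolver. The URL table is simulated (dict lookups stand in for real HTTP requests), and the limit of five redirects reflects the spec's minimum that compliant crawlers should follow:

```python
# Simulated HTTP responses: URL -> (status code, redirect target or body).
# These URLs mirror the hypothetical example above.
RESPONSES = {
    "https://www.example.com/robots.txt":
        (301, "https://cdn.example.com/robots.txt"),
    "https://cdn.example.com/robots.txt":
        (200, "User-agent: *\nDisallow: /private/"),
}

def fetch_robots(url, max_redirects=5):
    """Follow up to `max_redirects` consecutive redirects and return
    the body at the final URL, which an RFC 9309-compliant crawler
    treats as the robots.txt for the originally requested host."""
    for _ in range(max_redirects + 1):
        status, payload = RESPONSES[url]
        if status in (301, 302, 307, 308):
            url = payload  # follow the redirect
            continue
        return payload
    raise RuntimeError("too many consecutive redirects")

print(fetch_robots("https://www.example.com/robots.txt"))
```

Running the sketch shows the crawler ends up using the CDN-hosted rules for the main domain, even though it originally asked `www.example.com` for its robots.txt.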

Looking Back At 30 Years Of Robots.txt
As the Robots Exclusion Protocol celebrates its 30th anniversary this year, Illyes’ revelation highlights how web standards continue to evolve.


He even speculates whether the file needs to be named “robots.txt,” hinting at possible changes in how crawl directives are managed.
