What if robots.txt disallows itself?

by sag   Last Updated August 10, 2018 04:04 AM

User-agent: *
Disallow: /robots.txt

What happens if you do this? Will search engines crawl robots.txt once and then never crawl it again?
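For what a generic parser would report, here is a minimal sketch using Python's standard-library `urllib.robotparser` fed those two lines directly (the `example.com` host is just a placeholder). Note that under the Robots Exclusion Protocol, crawlers fetch `/robots.txt` itself before applying any rules, so in practice the file cannot block its own retrieval; a rule-matching parser, however, will report the path as disallowed:

```python
from urllib.robotparser import RobotFileParser

# Parse the robots.txt rules from the question.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /robots.txt",
])

# A literal reading of the rules disallows /robots.txt itself.
print(rp.can_fetch("*", "https://example.com/robots.txt"))  # False

# Other paths remain unaffected by that rule.
print(rp.can_fetch("*", "https://example.com/index.html"))  # True
```

So the rule is syntactically valid and matches its own file, but whether a search engine honors it for `/robots.txt` depends on the crawler; major crawlers treat the file as implicitly fetchable.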

Tags: robots.txt
