'Sitemap contains URLs which are blocked by robots.txt' warning, but the robots.txt file doesn't appear to block anything

by Peter Rowlands   Last Updated July 04, 2017 09:04 AM

Our site is built on WordPress.

Whilst in the development stage, we had the option "Discourage search engines from indexing this site" checked (Settings → Reading).

We have now made the site live and unchecked this option.

Yesterday I submitted a sitemap to Google's Search Console, but a lot of the sitemap paths are coming up with the warning:

Sitemap contains URLs which are blocked by robots.txt.

As far as I can tell, none of the site's URLs (apart from those under /wp-admin/) are being blocked by our robots.txt file. I have tested it in Search Console, and that says it is fine.

I read some articles saying that the file can be cached for a while, but it has now been a day since I submitted it.

Is there anything I am missing, or anything I can do to stop this warning from being thrown?

Sitemap - https://www.justaccounts.com/sitemap_index.xml

Robots File:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

A few examples of the URLs which are being shown as blocked:

  • https://www.example.com/page-sitemap.xml
  • https://www.example.com/category-sitemap.xml
  • https://www.example.com/attachment-sitemap.xml
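One way to sanity-check the rules locally is to run those URLs through Python's standard-library robots.txt parser. This is a rough sketch using the robots.txt and the example.com placeholder URLs from above; note that Python's `urllib.robotparser` applies rules first-match rather than Google's longest-match precedence, so it can disagree with Search Console's tester on the `Allow: /wp-admin/admin-ajax.php` line, but the sitemap URLs are unaffected by that difference:

```python
# Sketch: test the flagged URLs against the robots.txt rules from the
# question, using Python's standard-library parser.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
"""

rp = RobotFileParser()
rp.modified()  # mark the rules as "fetched" so can_fetch() gives real answers
rp.parse(ROBOTS_TXT.splitlines())

# The sitemap URLs flagged in Search Console (example.com placeholders),
# plus one /wp-admin/ path that genuinely should be blocked.
urls = [
    "https://www.example.com/page-sitemap.xml",
    "https://www.example.com/category-sitemap.xml",
    "https://www.example.com/attachment-sitemap.xml",
    "https://www.example.com/wp-admin/options.php",
]

for url in urls:
    verdict = "allowed" if rp.can_fetch("*", url) else "BLOCKED"
    print(f"{verdict:7}  {url}")
```

If every sitemap URL comes back allowed here (and when fetching the live robots.txt directly), the live file isn't the culprit, which points at Google still holding a cached copy of the robots.txt from when "Discourage search engines" was enabled; the warning should clear once Google refetches the file and the sitemap is resubmitted.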
