Our site is built on WordPress.
Whilst in the development stage, we had the "Discourage search engines from indexing this site" option checked (Settings → Reading).
We have now made the site live and unchecked this option.
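To rule out a stale page cache still serving the old markup, I also checked that the noindex robots meta tag that option injects is gone from the live pages. A rough sketch of that check (example.com stands in for our real domain):

    import re
    import urllib.request

    # example.com stands in for our real homepage
    url = "https://example.com/"
    html = urllib.request.urlopen(url).read().decode("utf-8", "replace")

    # "Discourage search engines" makes WordPress inject a robots meta
    # tag containing "noindex" into the <head> of every page
    tags = re.findall(r'<meta[^>]+name=["\']robots["\'][^>]*>', html, re.I)
    if any("noindex" in t.lower() for t in tags):
        print("noindex robots meta tag still served:", tags)
    else:
        print("no noindex robots meta tag found")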
Yesterday I submitted a sitemap to Google's Search Console, but a lot of the sitemap paths are coming up with the warning:
Sitemap contains URLs which are blocked by robots.txt.
As far as I can tell, none of the site's URLs (apart from /wp-admin/) are being blocked by our robots.txt file. I have tested them with the robots.txt tester in Search Console, and it reports them as allowed.
I have read some articles saying that Google can cache the robots.txt file for a little while, but it has now been a day since I submitted the sitemap.
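On the off chance that the server, a CDN, or a security plugin serves Googlebot a different robots.txt than it serves me, I also compared the file as fetched with two different user agents (again with example.com as a placeholder):

    import urllib.request

    ROBOTS = "https://example.com/robots.txt"  # placeholder for our real file

    def fetch(user_agent):
        req = urllib.request.Request(ROBOTS, headers={"User-Agent": user_agent})
        return urllib.request.urlopen(req).read().decode("utf-8", "replace")

    as_browser = fetch("Mozilla/5.0")
    as_googlebot = fetch("Googlebot/2.1 (+http://www.google.com/bot.html)")

    print("identical" if as_browser == as_googlebot
          else "robots.txt varies by user agent!")

A mismatch here would explain why Search Console sees blocked URLs while my own tests look fine.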
Is there anything I am missing, or anything I can do to stop this warning being thrown?
For reference, our robots.txt file contains:

    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php
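To double-check my reading of those rules, I replicated the Search Console test locally with Python's standard-library robots.txt parser; it fetches the live file, so it tests exactly what is served right now. (One caveat: Python's parser applies the first matching rule rather than the most specific one, so it can judge the admin-ajax.php Allow line more strictly than Googlebot would.) The sample paths below are placeholders for the URLs flagged in Search Console:

    from urllib.robotparser import RobotFileParser

    # example.com stands in for our real domain
    rp = RobotFileParser("https://example.com/robots.txt")
    rp.read()  # fetches and parses the live file

    # placeholder paths; the real check used the URLs flagged in Search Console
    for path in ["/", "/about/", "/wp-admin/"]:
        url = "https://example.com" + path
        verdict = "allowed" if rp.can_fetch("Googlebot", url) else "blocked"
        print(url, "->", verdict)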
A few examples of the URLs which are being shown as blocked: