Effect of Submitted URLs being blocked by robots

by Conor Grocock   Last Updated September 17, 2019 11:04 AM

I've got an alert on my site for pages being blocked by robots.txt, and I'm in the process of fixing it. But I'm curious: what is the ranking effect, if there is one at all?

And if there is an effect, does it scale with the number of pages blocked?



1 Answer


One should distinguish between blocked pages and blocked resources.

If a page is blocked by robots.txt, the bot can't come in and read it. But if there are links to the blocked page, it can nevertheless be indexed. The SERP result for an indexed but blocked page is ugly: no snippet, only the alert that the page is blocked.
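
To make the crawling-vs-indexing distinction concrete, here is a minimal Python sketch (not part of the original answer) using the standard library's robotparser; the example.com domain and the /private/ rule are made up for illustration. A disallowed URL can't be fetched by the bot, so no snippet can ever be built for it, yet the URL may still be indexed if other pages link to it.

    # Minimal sketch, assuming a hypothetical robots.txt for example.com.
    # robots.txt only controls crawling (reading), not indexing.
    import urllib.robotparser

    robots_txt = """
    User-agent: *
    Disallow: /private/
    """

    rp = urllib.robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())

    # The bot may not fetch this page, so it can never read its content or
    # build a snippet, but the URL can still be indexed via inbound links.
    print(rp.can_fetch("Googlebot", "https://www.example.com/private/page.html"))  # False
    print(rp.can_fetch("Googlebot", "https://www.example.com/public/page.html"))   # True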

Getting such an alert means the bot tried to visit the blocked page. How? Most likely through the sitemap. It is a common mistake to include pages blocked by robots.txt in the sitemap: it gives the bot opposite signals, "index it" vs. "don't read it".
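
As an illustration (not from the original answer), this Python sketch cross-checks a sitemap against robots.txt to surface exactly those conflicting signals; the site URL, sitemap path and user agent are assumptions made for the example.

    # Minimal sketch: list sitemap URLs that robots.txt disallows.
    # SITE, the sitemap path and USER_AGENT are hypothetical placeholders.
    import urllib.robotparser
    import urllib.request
    import xml.etree.ElementTree as ET

    SITE = "https://www.example.com"
    USER_AGENT = "Googlebot"

    # Load and parse the site's robots.txt
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url(SITE + "/robots.txt")
    rp.read()

    # Fetch the sitemap and walk every <loc> entry
    with urllib.request.urlopen(SITE + "/sitemap.xml") as resp:
        tree = ET.parse(resp)

    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    for loc in tree.findall(".//sm:loc", ns):
        url = loc.text.strip()
        # A URL submitted in the sitemap but disallowed in robots.txt
        # sends opposite signals: "index it" vs. "don't read it".
        if not rp.can_fetch(USER_AGENT, url):
            print("Conflicting signals:", url)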

The ranking effect is, as you can imagine, no ranking at all. If the bot can't read the page, it doesn't know which terms the page is relevant for.

If the number of blocked pages steadily grows, Google will in general downgrade quality signals for the whole site; first of all, it will send the bot to crawl it less often.

Evgeniy
September 17, 2019 10:55 AM
