Is it good to put an X-Robots-Tag on 410 pages that are still in Google's index?

by Sanjay Kumar   Last Updated October 18, 2019 09:04 AM

Google is still indexing pages that were set to return 410 two to three months ago. These pages should have dropped out of the index by now.

So, would it make sense to add an X-Robots-Tag: noindex, noarchive header alongside the 410 HTTP status?
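For illustration, here is a minimal sketch of what that combination would look like, using Flask as a hypothetical server (the route and response body are placeholders):

    from flask import Flask, Response

    app = Flask(__name__)

    # Hypothetical route for a retired page: respond with 410 Gone and an
    # explicit X-Robots-Tag header, as described above.
    @app.route("/retired-page")
    def retired_page():
        resp = Response("Gone", status=410)
        resp.headers["X-Robots-Tag"] = "noindex, noarchive"
        return resp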

What's your suggestion?

Tags: 410-gone


2 Answers


Look at your log files: has Googlebot visited the 410 pages since they started returning 410? If yes, just wait; if not, make a sitemap containing only the 410 pages and upload it in Search Console.
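A rough sketch of both steps in Python, assuming a combined-format access log at a hypothetical path and placeholder URLs:

    import re

    LOG_PATH = "/var/log/nginx/access.log"  # hypothetical path; adjust to your server

    # Combined log format: the quoted request line is followed by the status code.
    pattern = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[^"]*" (?P<status>\d{3})')

    googlebot_410_paths = set()
    with open(LOG_PATH) as log:
        for line in log:
            if "Googlebot" not in line:
                continue
            m = pattern.search(line)
            if m and m.group("status") == "410":
                googlebot_410_paths.add(m.group("path"))

    print(f"Googlebot received a 410 on {len(googlebot_410_paths)} distinct URLs")

    # If Googlebot has not seen them yet, a sitemap of only the gone URLs can be
    # submitted through Search Console (URLs below are placeholders).
    gone_urls = ["https://example.com/old-page-1", "https://example.com/old-page-2"]
    with open("sitemap-410.xml", "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url in gone_urls:
            f.write(f"  <url><loc>{url}</loc></url>\n")
        f.write("</urlset>\n")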

Evgeniy
October 18, 2019 08:06 AM

No, there is no need for that. As John Mueller said in a Google Webmaster Central hangout:

From our point of view, in the mid term/long term, a 404 is the same as a 410 for us. So in both of these cases, we drop those URLs from our index.

It is normal for Google to still crawl those URLs from time to time:

We’ll still go back and recheck and make sure those pages are really gone or maybe the pages have come back alive again.

If those pages are still indexed, it could be because they aren't very popular and Googlebot doesn't crawl them often. Just wait, or use the Remove URLs tool to speed up the process.
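Before waiting or filing a removal request, it can be worth confirming the URLs really do return a 410; a minimal check with the Python requests library (the URL is a placeholder):

    import requests

    # Placeholder URL; substitute a page that still appears in the index.
    url = "https://example.com/removed-page"
    resp = requests.get(url, allow_redirects=False, timeout=10)
    print(url, "->", resp.status_code)  # expect 410 if the page is truly gone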

Emirodgar
October 18, 2019 08:51 AM
