404s being re-indexed
-
Hi All,
We are experiencing issues with pages that return a 404 still being indexed. Originally, these were /wp-content/ index pages that had made their way into Google's index. Once I realized this, I added a directive to our .htaccess to 404 all of these pages, as there were hundreds. I tried to let Google crawl and drop these pages naturally, but after a few months I used the URL removal tool to remove them manually.
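For illustration, such a rule might look like this (a minimal sketch, assuming Apache 2.4 with mod_rewrite - not necessarily the exact directive in place):

# Return 404 for directory requests under /wp-content/; the -d condition
# limits the rule to directory listings so theme assets are not affected
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} -d
RewriteRule ^wp-content/ - [R=404,L]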
However, Google seems to be continually re-indexing these pages, even after they have been manually submitted for removal in Search Console. Do you have any suggestions? They all respond with 404s.
Thanks
-
Just to follow up - I have now actually 410'd the pages, and the 410s are still being re-indexed.
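For reference, the 410 version of such a rule is a one-flag change (sketch, assuming Apache mod_rewrite; the [G] flag forces a 410 Gone response):

RewriteEngine On
RewriteCond %{REQUEST_FILENAME} -d
RewriteRule ^wp-content/ - [G]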
-
I'll check this one out as well, thanks! I used a header-response extension called Web Developer, which reveals the presence of X-Robots headers.
-
First, it would be helpful to know how you are detecting that it isn't working. What indexation tool are you using to see whether the blocks are being detected? I personally really like this one: https://chrome.google.com/webstore/detail/seo-indexability-check/olojclckfadnlhnlmlekdihebmjpjnoa?hl=en-GB
Or, obviously, at scale: Screaming Frog.
-
Thank you for the quick response.
The pages are truly removed. However, because so many of these pages had leaked into the index, I added a redirect to keep users on our site - no intention of being "shady", I just didn't want hundreds of 404s getting clicked and causing a very high bounce rate.
For the X-Robots header, could you offer some insight into why my directive isn't working? I believe it's a regex issue with the wp-content match. I have tried to troubleshoot it to no avail.
<FilesMatch "(wp-content)">
Header set X-Robots-Tag: "noindex, nofollow"
</FilesMatch>

I appreciate the help!
-
Well, if a page has been removed and has not been moved to a new destination, you shouldn't redirect users anyway - that kind of 'tricks' them into thinking the content was found, which is bad UX.
If the content has been properly removed, or was never supposed to be there, just leave it as a 410 (but maybe create a nice custom 410 page, in the same vein as a decent custom 404 page). Use that page to admit the content is gone (without shady redirects) but point to related posts or products. Let the user decide, but still be useful.
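In Apache, serving a friendly page with the 410 status is a one-liner (a sketch; the page path is hypothetical):

# Show a custom "gone" page while still returning the 410 status code
ErrorDocument 410 /custom-410.html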
If the content is actually still there - hence the redirect - then you shouldn't be serving 404s or 410s in the first place. You should be serving 301s: proper HTTP redirects to the content's new (or revised) destination URL.
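For example, with mod_alias (a sketch; both URLs are hypothetical):

# Permanent redirect from the retired URL to its replacement
Redirect 301 /old-page https://www.example.com/new-page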
Yes, the HTTP header method is the correct replacement when the HTML implementation gets stripped out. The X-Robots HTTP header is the way to go for you!
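One way to set that header for everything under /wp-content/ (a sketch, assuming Apache with mod_setenvif and mod_headers; note that <FilesMatch> tests file names rather than directory paths, which is likely why the earlier directive never fired):

# Flag any request whose URI contains /wp-content/, then attach the header;
# "always" makes the header survive error responses such as 410s
SetEnvIf Request_URI "/wp-content/" NOINDEX_PAGE
Header always set X-Robots-Tag "noindex, nofollow" env=NOINDEX_PAGE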
-
Thank you! I am in the process of doing so; however, with a 410 I cannot leave my JS redirect in place after the page loads, which creates some UX issues. Do you have any suggestions to remedy this?
Additionally, after the 410, the non-X-Robots noindex is now being stripped, so the URL resolves to a 410 with no noindex or redirect. I am still working on a noindex header; since the 410 is served server-side, I assume that would be the only way - correct?
-
You know that a 404 effectively tells crawlers "not found right now, but it may come back", right? By leaving the page's status ambiguous, you actively encourage Google to come back and check later.
If you want to say that the page is permanently gone, use status code 410 (Gone).
Keeping the noindex in the HTTP header via X-Robots was a good call. But combining a noindex with a 404 was a bad call, as they send mixed signals ("don't index me now, but do come back and index me later, as I'll probably be back at some point").
Use a noindex with a 410 instead - they agree with each other ("don't index me now and don't bother coming back").
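Putting the two together in .htaccess might look like this (a sketch, assuming Apache 2.4 with mod_rewrite, mod_setenvif and mod_headers; adjust the path pattern as needed):

# Serve 410 Gone for the retired /wp-content/ pages
RewriteEngine On
RewriteRule ^wp-content/ - [G]
# Attach a noindex header to the same URLs; "always" is required because
# a plain "Header set" is dropped from error responses such as the 410
SetEnvIf Request_URI "/wp-content/" GONE_PAGE
Header always set X-Robots-Tag "noindex, nofollow" env=GONE_PAGE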
-
Yes, all pages have a noindex. I have also tried to noindex them via .htaccess, to add an extra layer of security, but it seems to be incorrect. I believe it is an issue with the regex; I'm attempting to match anything containing wp-content.
<filesmatch "(wp-content)"="">Header set X-Robots-Tag: "noindex, nofollow"</filesmatch>
-
Back to basics: have you marked those pages/posts as 'noindex'? With many WP plugins, you can noindex them in bulk and then submit for re-indexation.