Quickest way to deindex a large number of pages
-
Our site was recently hacked by spammers posting fake content and bringing down our servers, etc. After a few months, we finally figured out what was going on and fixed the issue. However, it turns out that Google has indexed 26K+ spammy pages and we've lost page rank and search engine rankings as a result.
What is the best and fastest way to get these pages out of Google's index?
-
Since I assume you've already removed these pages from your site, there's no longer any page on which to add a meta noindex tag.
Disallowing these pages in robots.txt in no way signals to the search engines that they should be removed from the index, just that they should no longer be crawled. Given that they're already indexed, blocking in robots.txt would potentially save some "crawl budget" but wouldn't do anything to remove them from the index.
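To illustrate the point, here's a quick sketch using Python's standard-library robots.txt parser. The `example.com` domain and the `/spam/` path are placeholders; substitute your own site's paths.

```python
# Demonstrates that Disallow only blocks crawling: the rule says
# "don't fetch", and says nothing about removing an already-indexed URL.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /spam/",
])

# A crawler honoring robots.txt will skip the blocked URLs...
print(rp.can_fetch("*", "https://example.com/spam/fake-page.html"))  # False
# ...while other URLs remain crawlable. A blocked URL that is already
# indexed simply stops being re-crawled; it is not dropped from the index.
print(rp.can_fetch("*", "https://example.com/real-page.html"))       # True
```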
So submitting them to the URL Removal Tool, along with an explanation, would be by far the most effective approach.
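Before submitting removal requests, it's also worth confirming that every spam URL now returns a 404 or 410; otherwise the pages may be re-indexed once the removal expires. A minimal sketch of such a check (the commented-out URL list is a placeholder for your own removal list):

```python
import urllib.request
import urllib.error

def gone_for_good(status):
    """404 (Not Found) and 410 (Gone) both signal that the page is
    permanently removed; any other status risks later re-indexing."""
    return status in (404, 410)

def check_removed(url, timeout=10):
    """Fetch a URL and report whether its status supports permanent removal."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            # A 200 here means the spam page is still live; fix that first.
            return gone_for_good(resp.status)
    except urllib.error.HTTPError as e:
        return gone_for_good(e.code)
    except urllib.error.URLError:
        return False  # unreachable; re-check rather than assume it's gone

# spam_urls = [...]  # your removal list
# not_ready = [u for u in spam_urls if not check_removed(u)]
```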
You'll also want to keep a very close watch on your penalty warnings within Webmaster Tools. If you get flagged, you'll want a complete history of the issue and the steps you've taken to address it in order to prepare a reinclusion request.
Lastly, don't forget to submit these same URLs to the Bing Webmaster Tools Block URLs tool. You may not get a massive amount of traffic from Bing, but there's no sense throwing it away, since you've already prepared the URL removal list anyway.
Hope that helps!
Paul
-
Yup. Just wanted to add that if these pages are all in a particular directory, you can deindex the entire directory with a single request in the URL removal tool.
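If you're not sure whether the spam URLs share common directories, a short script can tally the first path segment of each URL so you can see which prefixes could be removed in one request instead of thousands. The sample list and `example.com` paths below are hypothetical:

```python
from collections import Counter
from urllib.parse import urlparse

def removal_prefixes(urls):
    """Count how many URLs fall under each top-level directory, so the
    biggest offenders can be submitted as a single prefix removal."""
    dirs = Counter()
    for u in urls:
        segments = [s for s in urlparse(u).path.split("/") if s]
        if segments:
            dirs["/" + segments[0] + "/"] += 1
    return dirs

# Hypothetical sample; substitute your real removal list.
spam_urls = [
    "https://example.com/cheap-pills/page-1.html",
    "https://example.com/cheap-pills/page-2.html",
    "https://example.com/blog/spam-comment.html",
]
print(removal_prefixes(spam_urls))
# Counter({'/cheap-pills/': 2, '/blog/': 1})
```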
-
Add a noindex meta tag to these pages (if they still exist)
Request Google to remove the URLs from their index via a WMT URL removal request
Disallow in robots.txt, but only after the above; if crawling is blocked first, Google can never see the noindex tag
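For reference, the noindex mentioned above can be set either in the page's markup or, for pages and files you can't edit, as an HTTP response header (a generic sketch, not specific to any server):

```html
<!-- In the <head> of each page that should be dropped from the index: -->
<meta name="robots" content="noindex">

<!-- Or, equivalently, sent as an HTTP response header:
     X-Robots-Tag: noindex -->
```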