Quickest way to deindex large parts of a website
-
Hey there,
my client's website was set up with subdirectories for almost every country in the world, plus multiple languages within each country. The content in each subfolder is (almost) identical. So, no surprise: they have a big problem with duplicate content and ranking fluctuations.
Since they don't want to change the site's structure, I recommended limiting the languages available in each subfolder with robots.txt. However, before doing this we marked the content to be excluded with noindex, nofollow. It's only been two days, but I've hardly noticed any decline in the number of indexed pages.
I was therefore wondering whether it would speed things up if I marked the pages with just noindex instead of noindex, nofollow.
It would be great if you could share your thoughts on that.
Cheers,
Jochen
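For reference, the two variants in question differ only in the second directive. A minimal sketch of the markup (the paths and pages are placeholders, not the client's actual site). One point worth keeping in mind: noindex alone is what removes a page from the index, while nofollow only tells crawlers not to follow the page's links; and for either directive to be seen at all, the page must remain crawlable, i.e. not yet blocked in robots.txt:

```html
<!-- Variant 1: deindex the page and don't follow its links -->
<meta name="robots" content="noindex, nofollow">

<!-- Variant 2: deindex the page but still follow its links -->
<meta name="robots" content="noindex">
```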
-
Thanks for the hint, Dirk! I've used the tool and it works great. I even found a handy Chrome extension ("WebMaster Tools - Bulk URL removal") that made the removal of my 3,000 subdirectories very smooth and saved me about 25 hours of manual work!
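For anyone wanting to reproduce this: assuming the extension accepts a plain text list of URLs, one per line, a short script can generate that list for nested country/language folders. This is a hypothetical sketch — the domain and folder names below are placeholders, not Jochen's actual structure:

```python
# Hypothetical sketch: build the URL list that a bulk-removal tool can consume,
# one URL per line. The domain and folder names are placeholders, not the
# poster's actual site structure.
from itertools import product

DOMAIN = "https://www.example.com"
countries = ["de", "fr", "it"]   # stand-ins for the real country folders
languages = ["en", "es"]         # stand-ins for the redundant language folders

# One entry per country/language combination, e.g. https://www.example.com/de/en/
urls = [f"{DOMAIN}/{country}/{lang}/" for country, lang in product(countries, languages)]

with open("removal-list.txt", "w") as fh:
    fh.write("\n".join(urls))
```

With the real folder lists swapped in, 3,000 subdirectories become a one-second file write instead of hours of clicking.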
-
Hi,
There was a similar question a few days ago: https://moz.com/community/q/is-there-a-limit-to-how-many-urls-you-can-put-in-a-robots-txt-file
Quote: Google Webmaster Tools has a great tool for this. If you go into WMT and select "Google Index", then "Remove URLs", you can use regex to remove a large batch of URLs, then block them in robots.txt to make sure they stay out of the index.
Dirk
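To illustrate the "block them in robots.txt" step: a directory-level Disallow is usually simpler than listing thousands of individual URLs. A sketch with placeholder folder names — and note the block should only go live once the pages have actually dropped out of the index, since a page blocked in robots.txt can no longer be crawled, so its noindex tag will never be seen:

```
# robots.txt — block the redundant language folders (placeholder paths)
User-agent: *
Disallow: /us/fr/
Disallow: /us/de/
Disallow: /de/fr/
```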
Related Questions
-
Fun and games -- not -- after website relaunch (Rails to Wordpress)
I'm after some advice. I've dropped 1 million pageviews in the last 30 days after a website relaunch a few weeks ago, and my revenue is a quarter of what it was, which is a massive worry. After 12 years of hard work and a much-anticipated upgrade, it's been a nightmare. There were so many things with the changeover that should have been done differently, but I have been fixing up the aftermath like crazy since then and nothing seems to be improving. I just want to know if there are any glaring issues I have missed that I can focus on. I have been working on ridiculous stuff like duplicate content from multiple imports (301 redirects and removing the dupes from Google) and so much more. Feels like a bomb and I am stuck underneath it. Website is BellyBelly.com.au. Thanks in advance.
Intermediate & Advanced SEO | BellyBellyKelly -
4 websites with same content?
I have 4 websites (1 main, 3 duplicates) with the same content. Now I want to change the content on the duplicate websites, while the main website keeps its current content. Is there any problem with my thinking?
Intermediate & Advanced SEO | marknorman -
301 Redirect from now defunct website?
Hi guys, quick question about 301 redirection between domains. I currently manage a website, let's call it Website A. Website A sells a particular product range; however, the powers that be have decided to pull the plug on the business and sell the products previously sold via Website A through another website within the parent company's control, let's call it Website B. I need to make it clear to customers of Website A that the company no longer operates, but I want to pass the SEO equity that has been built up over time to the relevant pages on Website B. My plan was to: 1. 301-redirect all key landing pages on Website A to the most relevant pages on Website B. 2. Initially keep the Website A homepage live but change the message to say "Website A no longer operates, but Website B can help, etc." and remove all sub-links from the navigation. 3. Monitor referral and direct traffic levels and consider 301-redirecting the Website A homepage to the Website B homepage in the long term. My questions: Does this sound like the best approach? If not, what alternatives are there? Will Website A look like a link farm for Website B? I don't want that, obviously!
Intermediate & Advanced SEO | DHS_SH -
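For step 1 in the plan above, the page-to-page 301s can be declared in Website A's server config. A hedged sketch for an Apache setup — the domain and paths below are hypothetical stand-ins, not the poster's actual URLs:

```
# .htaccess on Website A — permanent redirects to the most relevant pages on Website B
RewriteEngine On
RewriteRule ^products/widgets/?$ https://www.website-b.example/widgets/ [R=301,L]
RewriteRule ^products/gadgets/?$ https://www.website-b.example/gadgets/ [R=301,L]
```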
Large number of pages crawled.
My campaign for printlabelandmail.com says that SEOmoz has crawled 619 pages. My site, however, only has a little over 250 pages. Where are these extra pages coming from? I did recently relaunch my website with WordPress; I was using Dreamweaver before. I thought I had deleted all the old pages. Could these extra pages be old pages from the site prior to my relaunch? I hope my question makes sense. Any insights would be helpful. Thanks! Andrea
Intermediate & Advanced SEO | JimDirectMailCoach -
Moving Part of a Website to a Subdomain to Remove Panda Penalty?
I have lots of news on my website, and unlike other types of content, news posts quickly become obsolete and get a high bounce rate. I have reason to think that the news on my website might be partly responsible for a Panda penalty, but I'm not sure. There are over 400 news posts on the blog from the last 4 years, so that's still a lot of content. I was thinking of isolating the news articles on a subdomain (news.mywebsite.com). If the news plays a part in the Panda penalty, would that remove it from the main domain?
Intermediate & Advanced SEO | sbrault74 -
SEOmoz is only crawling 2 pages out of my website
I have checked in Google Webmaster Tools and Google is crawling around 118 pages out of my website, store.itpreneurs.com, but SEOmoz is only crawling 2 pages. Can someone help me? Thanks, Diogo
Intermediate & Advanced SEO | jslusser -
Purpose of a Blog in a website
How is an internal or external blog helpful for SEO? Why is it good to have a site with a blog?
Intermediate & Advanced SEO | Alick300 -
Optimising a Dynamic website ?
A client has bought the Nostalgia WP theme. I've installed Yoast, but because the website is Ajax-based and the content for the pages is dynamically loaded, the plugin won't work. Or at least not to my knowledge? The developer doesn't currently have a solution, which from previous experience means it will never be supported. So I need some possible solutions here. Create a mobile site? Cons: more time, more money, etc. Create non-dynamic pages linked in the footer area? Cons: page duplication, etc. It's a small niche, so having the basic elements in place is imperative to getting it ranking.
Intermediate & Advanced SEO | StephenForde