When you add 10,000 pages that have no real intention of ranking in the SERPs, should you use "noindex,follow" or disallow the whole directory through robots.txt? What is your opinion?
-
I just want a second opinion.
The customer doesn't want to lose any internal link value by evaporating it across a huge number of internal links. What would you do?
-
Hi Jeff,
Thanks for your answer. Please take a look at my reply to Federico above.
-
Hi Federico,
In this case it's an affiliate website, and the 10,000 pages are all product pages. The content all comes from datafeeds, so it's duplicate content.
We don't want to index this that's for sure.
So: noindex,follow, disallow the whole directory, or both...
We have our own opinion about this but I want to hear what others are thinking about this
Thanks in advance!
-
Yep, I agree with belt and suspenders.
-
Wesley - I do agree with Federico.
That said, if they really don't want those pages indexed, use the belt-and-suspender method (if you wear both a belt and suspenders, chances are greater that your pants won't fall down).
I'd put a robots.txt file in place to disallow crawling of the directory, and also noindex/nofollow each of the pages, too.
That way, if someone working on the site later changes the pages back to followed, you're still covered. Likewise if someone blows away the robots.txt file.
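For anyone following along, that belt-and-suspenders setup boils down to two pieces of configuration. A minimal sketch (the /products/ directory name is just a placeholder, not from the thread):

```text
# robots.txt at the site root -- blocks crawling of the whole directory
User-agent: *
Disallow: /products/

<!-- meta robots tag in the <head> of each product page -->
<meta name="robots" content="noindex, follow">
```

One caveat worth knowing: if the directory is disallowed in robots.txt, crawlers can't fetch the pages at all, so they never see the meta tag. That's why noindex on its own is often the safer primary mechanism, with the robots.txt rule as the backup.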
Just my $0.02, but hope it helps…
-- Jeff -
What do they have, 10,000 pages of uninteresting content? A robots meta tag of noindex,follow will do to keep them out of the engines. But to decide, you really need to know what's on those pages. 10,000 isn't a few, and if there's valuable content worth sharing, a page could earn a link, and if you disallow that page through robots.txt, the link won't even flow PageRank.
It all comes down to what are those pages for...?
Related Questions
-
Highly ranked pages to new domain?
Hi everyone! We are ranked #1 for about 30 product pages at www.oldsite.com/product1 and we want to move about 30 of those pages to a new site www.newsite.com/product1 (new domain and hosting - which we own). What is the best way to do this? I'm confused about whether you recreate those pages on the new domain vs. FTP-move them, 301 redirects, etc. Looking for the things we must do and the sequence to do it all, etc. Thanks so much!
Intermediate & Advanced SEO | | Jamesmcd030 -
Conditional Noindex for Dynamic Listing Pages?
Hi, We have dynamic listing pages that are sometimes populated and sometimes not populated. They are clinical trial results pages for disease types, some of which don't always have trials open. This means that sometimes the CMS produces a blank page -- pages that are then flagged as thin content. We're considering implementing a conditional noindex -- where the page is indexed only if there are results. However, I'm concerned that this will be confusing to Google and send a negative ranking signal. Any advice would be super helpful. Thanks!
Intermediate & Advanced SEO | | yaelslater0 -
Fix broken external links on noindex, follow pages no one visits?
Would you take the time to fix external links on noindex,follow pages of your site that no one ever visits? The only reason to do it would be to present a tidier site to Google, but would it really care if those pages are noindex/follow? The thing that makes it a non-trivial amount of work is that there are hundreds of these on a large site. Do you think Google cares, if they're noindex/follow? I know the safe answer is always fix everything, but really it has to get weighed against the likely benefit and other projects with a limited amount of time to work with. Best... Mike
Intermediate & Advanced SEO | | 945010 -
Strange 404s in GWT - "Linked From" pages that never existed
I’m having an issue with Google Webmaster Tools saying there are 404 errors on my site. When I look into my “Not Found” errors I see URLs like this one: Real-Estate-1/Rentals-Wanted-228/Myrtle-Beach-202/subcatsubc/ When I click on that and go to the “Linked From” tab, GWT says the page is being linked from http://www.myrtlebeach.com/Real-Estate-1/Rentals-Wanted-228/Myrtle-Beach-202/subcatsubc/ The problem here is that page has never existed on myrtlebeach.com, making it impossible for anything to be “linked from” that page. Many more strange URLs like this one are also showing as 404 errors. All of these contain “subcatsubc” somewhere in the URL. My Question: If that page has never existed on myrtlebeach.com, how is it possible to be linking to itself and causing a 404?
Intermediate & Advanced SEO | | Fuel0 -
Wordpress Tag Pages - NoIndex?
Hi there. I am using the Yoast Wordpress Plugin. I just wonder if any tests have been done on the effects of Index vs. Noindex for tag pages (like when tagging a word relevant to an article). Thanks 🙂 Martin
Intermediate & Advanced SEO | | s_EOgi_Bear0 -
Are my "Terms & Conditions", "Privacy Policy" and "About Us" pages stealing link juice?
Should I make them nofollow, or is this a bogus method?
Intermediate & Advanced SEO | | SEObleu.com0 -
Why is noindex more effective than robots.txt?
In this post, http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo, it mentions that the noindex tag is more effective than using robots.txt for keeping URLs out of the index. Why is this?
Intermediate & Advanced SEO | | nicole.healthline0 -
10,000 New Pages of New Content - Should I Block in Robots.txt?
I'm almost ready to launch a redesign of a client's website. The new site has over 10,000 new product pages, which contain unique product descriptions, but do feature some similar text to other products throughout the site. An example of the page similarities would be the following two products: Brown leather 2 seat sofa; Brown leather 4 seat corner sofa. Obviously, the products are different, but the pages feature very similar terms and phrases. I'm worried that the Panda update will mean that these pages are sandboxed and/or penalised. Would you block the new pages? Add them gradually? What would you recommend in this situation?
Intermediate & Advanced SEO | | cmaddison0