Why is noindex more effective than robots.txt?
-
This post, http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo, mentions that the noindex tag is more effective than robots.txt for keeping URLs out of the index. Why is this?
-
Good answer. We've also seen that bots sometimes arrive at a site directly via a link without first visiting the robots.txt file, and will therefore index the first page they land on.
Matt Cutts has said before that the only 100% fail-safe way to keep search engines from indexing something is to password-protect it.
-
A Disallow rule in robots.txt will prevent bots from crawling that page, but it will not prevent the page from appearing in SERPs. If a page that is disallowed in robots.txt has a lot of links pointing to it, it may still appear in SERPs. I've seen this on a few of my own pages, and Google picks a strange title for the page in those cases.
If you add the meta noindex tag instead, Google will actively remove that page from its search results the next time it crawls the page and sees the tag.
Here's one Webmaster Central thread I found about it.
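To illustrate the crawl-blocking side of this, here's a minimal sketch using Python's standard-library `urllib.robotparser`. The domain, paths, and rules are made up for illustration; the point is that a compliant crawler checks robots.txt before fetching, which is a different mechanism from index removal:

```python
import urllib.robotparser

# Hypothetical robots.txt rules for illustration only
rules = [
    "User-agent: *",
    "Disallow: /members/",
]

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

# A compliant crawler refuses to fetch blocked paths, so it never
# sees the page (or any noindex tag on it); but the URL itself can
# still be indexed from external links.
blocked = rp.can_fetch("Googlebot", "https://example.com/members/12345")
allowed = rp.can_fetch("Googlebot", "https://example.com/blog/some-post")
print(blocked, allowed)  # False True
```

This is also why robots.txt and meta noindex don't combine well: if a page is disallowed, the crawler can never fetch it to see the noindex directive.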
Related Questions
-
Is domain authority lost if you create a 301 redirect but mark it as noindex, nofollow?
Hi everyone, Our company sells products in various divisions. While we've been selling Product A and Product B under our original brand, we've recently created a new division with a new domain to focus on Product B. The new domain has virtually no domain authority (3) while the original domain has some (37). We want customers to arrive on the new domain when they search for key terms related to Product B, instead of on the pages that previously existed on our main website. If we create 301 redirects for the pages and content on the main site and add noindex, nofollow tags, will we lose the domain authority we have from our original domain because the pages now have the noindex, nofollow tags? I read a few blog posts from Moz saying that no domain authority is lost with 301 redirects, but I'm not sure that holds if the pages are noindex, nofollow. Do you follow? 🙂 Apologies for the lengthy post. Love this community and the great Moz team. Thanks, Joe
Intermediate & Advanced SEO | jgoehring-troy
Pages blocked by robots
**This was a mistake made during development.** How can I fix the problem quickly? Please help me. [XTRjH](https://imgur.com/a/XTRjH)
Intermediate & Advanced SEO | mihoreis
Robots.txt vs noindex
I recently started working on a site that has thousands of member pages that are currently robots.txt'd out. Most pages of the site have 1 to 6 links to these member pages, which adds up to what I regard as something of a link-juice cul-de-sac. The pages themselves have little to no unique content or other relevant search value, and for other reasons we still want them kept out of search. Wouldn't it be better to "noindex, follow" these pages and remove the robots.txt block from this URL type? At least that way Google could crawl these pages and pass the link juice on to other pages instead of flushing it into a black hole. BTW, the site is currently dealing with a hit from Panda 4.0 last month. Thanks! Best... Darcy
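For reference, the swap being proposed here is simple on the implementation side. This is a sketch assuming the member pages live under a `/members/` path (hypothetical; substitute the real URL pattern):

```html
<!-- 1. In robots.txt: remove the rule blocking the member pages,
        e.g. delete a line like "Disallow: /members/".

     2. Then add this tag to the <head> of each member page.
        Crawlers can now fetch the page, drop it from the index,
        and still follow its links so equity flows onward. -->
<meta name="robots" content="noindex, follow">
```

The order matters: the noindex tag only takes effect after the robots.txt block is lifted, because a blocked page is never crawled and the tag is never seen.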
Intermediate & Advanced SEO | 94501
Meta NOINDEX and links into the pages?
If I have internal links pointing to pages that are meta noindexed, will Google still index them? Or does noindex only apply to pages that are linked to from an external domain? Thanks!
Intermediate & Advanced SEO | bjs2010
Can Dramatically Increasing Site Size Have Negative Effects?
I have a site with about 1000 pages. I'm planning to add about 30,000 pages to it. Can increasing the footprint by such an amount all of a sudden have any negative consequences for existing organic or hoped-for benefits from new pages? Would the site draw any increased scrutiny from Google for doing this? Any other considerations? Thanks... Darcy
Intermediate & Advanced SEO | 94501
Can you use more than one meta robots tag per page?
If you want to add both "noindex, follow" and "noodp", should you add two meta robots tags, or is there a way to combine both into one?
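For what it's worth, meta robots directives can be combined into a single comma-separated tag; a sketch of both forms (note the directive is spelled `noodp`, for opting out of Open Directory Project descriptions):

```html
<!-- Combined form: one tag, comma-separated directives -->
<meta name="robots" content="noindex, follow, noodp">

<!-- Equivalent separate tags; crawlers read both -->
<meta name="robots" content="noindex, follow">
<meta name="robots" content="noodp">
```

Either form is treated the same; the combined tag is just tidier.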
Intermediate & Advanced SEO | nicole.healthline
What is the effect of using jQuery sliders for content on SEO?
I know that using CSS in deceptive ways gets you penalized. I didn't know if JS counts the same, since you are essentially hiding parts of the content and showing it in intervals as slides. The goal would be to put key items for a client in divs and rotate those divs via a slider plugin as slides. I was just curious whether that affects things in any way. Thanks! ~Paul
Intermediate & Advanced SEO | peb7268
10,000 New Pages of New Content - Should I Block in Robots.txt?
I'm almost ready to launch a redesign of a client's website. The new site has over 10,000 new product pages, which contain unique product descriptions but do share some similar text with other products throughout the site. An example of the page similarities would be the following two products: Brown leather 2-seat sofa; Brown leather 4-seat corner sofa. Obviously the products are different, but the pages feature very similar terms and phrases. I'm worried that the Panda update will mean these pages are sandboxed and/or penalised. Would you block the new pages? Add them gradually? What would you recommend in this situation?
Intermediate & Advanced SEO | cmaddison