Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Which pages to "noindex"
-
I have read through the many articles regarding the use of meta noindex, but what I haven't been able to find is a clear explanation of when, why, or on which pages to use it.
I'm thinking that it would be appropriate to use it on:
legal pages such as the privacy policy and terms of use
search results pages
blog archive and category pages
Thanks for any insight on this.
-
Here are two posts that may be helpful, both in explaining how to set up a robots.txt file for WordPress and in the thinking behind which parts to exclude.
http://www.cogentos.com/bloggers-guide-to-using-robotstxt-and-robots-meta-tags-to-optimise-indexing/
http://codex.wordpress.org/Search_Engine_Optimization_for_WordPress#Robots.txt_Optimization
The WordPress link (the second one) also points to several other resources.
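As a minimal sketch of what those guides describe, a WordPress robots.txt might look something like the following. The exact paths are illustrative (and `example.com` is a placeholder), so treat this as a starting point rather than a definitive template:

```text
# Illustrative robots.txt for a WordPress site -- adjust to your setup.
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /?s=
Disallow: /search/

Sitemap: https://example.com/sitemap.xml
```

Keep in mind that robots.txt blocks crawling, not indexing: a URL disallowed here can still appear in results if it's linked from elsewhere. That's why the meta noindex tag discussed in this thread is the more reliable way to keep a page out of the index.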
-
Yes, I'm using WordPress.
-
You also want to block any admin directory, plugin directory, etc. Are you using WordPress or another specific CMS? There are often best-practice posts on robots.txt files for specific platforms.
-
Yes, generally you would noindex your about us, contact us, privacy, and terms pages, since these are rarely searched for and in fact are so heavily linked to internally that they would rank well if indexed.
All internal search results should be noindexed; Google wants to do the searching itself.

Definitely NOT blog/category pages - these are your gold content!
I also noindex any URL accessed via HTTPS.
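For reference, the tag being discussed here is a single line in the page's `<head>`; this is a generic sketch, not specific to any CMS:

```html
<!-- Keeps this page out of the index while still letting crawlers
     follow its links and pass link equity onwards. -->
<meta name="robots" content="noindex, follow">
```

`noindex` on its own already implies `follow`, so the explicit `follow` mostly documents intent; the value to avoid is `noindex, nofollow` (or `content="none"`), which cuts the page's links out of the crawl as well.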
-
As well as pagination pages, I have read (though not done it myself) that you should consider using it on low-value pages that you wouldn't want to rank above other pages on the site (hopefully they wouldn't anyway), and also on sitemaps, since you don't necessarily want them to appear in the index but definitely want them followed.
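One practical wrinkle with sitemaps: an XML file can't carry an HTML meta tag, so the noindex has to be sent as an HTTP response header instead. A hedged sketch for Apache, assuming mod_headers is enabled (the file pattern is illustrative; on nginx you'd use `add_header` inside a matching `location` block):

```apache
# Send noindex as an HTTP header for files that can't hold a meta tag.
<FilesMatch "sitemap\.xml$">
    Header set X-Robots-Tag "noindex"
</FilesMatch>
```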
-
Noindexed pages are pages that you want your link juice flowing through, but that you don't want to rank as individual entries in the search engines.
-
I think your legal pages should rank as individual pages. If I wanted to find your privacy policy and searched for 'privacy policy company name', I'd expect to find an entry I could click to reach it.
-
Your internal search results pages are great candidates for a noindex attribute. If a search engine robot happens to stumble upon one (via a link from somebody else, for example), you'd still want the spider to crawl onwards from there, spreading link juice over your site. However, under most circumstances you don't want the result page itself to rank in the search engines, as it usually offers thin value to your visitors.
-
Blog archive and category pages are useful to visitors, and I personally wouldn't noindex these.
Bonus: your paginated results ('page 2+ in a result set that has multiple pages') are great candidates for noindex. It'll keep the juices running without cluttering the search index with these pretty much meaningless (and highly dynamic) pages.
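The 'page 2+' rule is simple to express in a template helper. A minimal sketch in Python; the function name and interface are invented for illustration and don't come from any particular CMS:

```python
def robots_meta(page_number: int) -> str:
    """Return the robots meta tag for a paginated archive page.

    Page 1 is the canonical entry point and stays indexable;
    deeper pages remain crawlable (follow) so link juice keeps
    flowing, but are kept out of the index (noindex).
    """
    if page_number <= 1:
        return '<meta name="robots" content="index, follow">'
    return '<meta name="robots" content="noindex, follow">'
```

A template would call this with the current page number and emit the result into the page's `<head>`.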