Canonical & noindex? Use together
-
For duplicate pages created by the "print" function, SEOmoz says it's better to use noindex (http://www.seomoz.org/blog/complete-guide-to-rel-canonical-how-to-and-why-not), while JohnMu says it's better to use the canonical tag (http://www.google.com/support/forum/p/Webmasters/thread?tid=6c18b666a552585d&hl=en).
What do you think?
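For a hypothetical print URL such as http://example.com/article/print, the two recommendations would look roughly like this in the print page's head; the URLs are placeholders, not taken from either linked source:

```html
<!-- Option A: point the print page's canonical at the web version
     (the approach JohnMu describes in the linked thread) -->
<link rel="canonical" href="http://example.com/article" />

<!-- Option B: keep the print page out of the index entirely
     (the approach the SEOmoz post favours) -->
<meta name="robots" content="noindex" />
```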
-
I'm working to remove low-quality pages from a directory while still allowing a few high-quality pages in the same directory to be spidered and indexed. To do this, I placed a robots noindex tag on the low-quality pages we don't want indexed.
These noindex tags were implemented yesterday, but the low-quality pages aren't going away. I even used "Fetch as Googlebot" to force a crawl of a few of the low-quality pages. Maybe I need to give them a few days to disappear, but this got me thinking: "Why would Google ignore a robots noindex tag?" Then I came up with a theory. I noticed that we include a canonical tag by default on every page of our site, including the ones I want to noindex. I've never used a noindex tag in conjunction with a canonical tag, so maybe the canonical tag is confusing the search engine spiders.
I did some research and found a quote from Googler JohnMu in the following article: http://www.seroundtable.com/archives/020151.html. It's not an exact match to my situation because our canonical tag points to itself, rather than another URL. But it does sound like using them together is a bad idea.
Has anyone used or seen canonical and noindex tags together in the wild? Can anyone confirm or deny this theory that the canonical screws up the efficacy of the meta robots tag?
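To make the theory concrete, this is roughly the combination described above sitting in the head of each low-quality page; the URL is a placeholder:

```html
<!-- Self-referencing canonical that the site adds to every page by default -->
<link rel="canonical" href="http://example.com/directory/low-quality-page" />

<!-- Noindex added only to the low-quality pages -->
<meta name="robots" content="noindex" />

<!-- The two tags send mixed signals: the canonical nominates this URL as the
     version to index, while the robots tag asks for it not to be indexed at all. -->
```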
- 2 years later
-
I agree with Lindsay's reasoning but am not clear on her statement on this subject: "If your website's print pages include a link back to the original page, you can use the meta robots 'noindex' tag here too. The page stays out of the index and any link value will be passed back to the original, canonical, web version of the page."
If you add the "noindex" tag to the print page, search engines will disregard the page, which SHOULD leave them with only the canonical version of the page. You are requiring the search engine to do some guessing, which is what we want to avoid. By using the canonical tag, we are expressly telling the search engine the correct version of the page to index.
From the above quote, it sounds like Lindsay is suggesting using both "noindex" and the canonical tag. The focus of her article is that there are superior methods of canonicalizing web pages without using the canonical tag, so it leaves me unclear on the logic.
I presently use the canonical tag in these situations. I would love to ask Lindsay for additional clarification on the reasoning for the "noindex" tag in this instance. The last blog comment was a question asked in May which was never responded to, so it seems like she doesn't visit the site too often.
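Reading the quoted passage literally, the print page Lindsay describes would carry a noindex plus an ordinary link back to the web version, with no canonical tag required; a sketch with placeholder URLs:

```html
<!-- In the <head> of the print version: stay out of the index, but keep passing link value -->
<meta name="robots" content="noindex, follow" />

<!-- In the body: a normal, crawlable link back to the original, canonical web version -->
<a href="http://example.com/article">View the web version of this article</a>
```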
Related Questions
-
Using H3 before or instead of an H2...
My designer and I have been having an argument: we have a blog with short, 400-word posts. They have an H1 with nice keywords and a catchy title, and then a few subheadings. I don't like making the subheadings H2 because the font looks way too large in WordPress, so my designer wants to make them all H4s so the font looks to be a nicer size. Here's my problem with that, and why I usually just bold the subheadings: is it really bad to put a bunch of H4s right under an H1, with no H2s or H3s in between? I'm reading different arguments on the internet about this and gladly welcome more debate and/or case studies. Thank you!
Intermediate & Advanced SEO | Feb 9, 2018, 9:41 PM | genevieveagar
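Not to settle the debate above, but the usual alternative to demoting subheadings for visual reasons is to keep them as H2s and control their size in CSS; a minimal sketch (the class name and size are made up):

```html
<article>
  <h1>Catchy, keyword-rich post title</h1>

  <!-- Subheadings stay H2 so the document outline is intact... -->
  <h2 class="post-subhead">First subheading</h2>
  <p>Part of the ~400-word post…</p>

  <h2 class="post-subhead">Second subheading</h2>
  <p>…</p>
</article>

<style>
  /* ...while the stylesheet decides how big they actually look */
  .post-subhead {
    font-size: 1.1rem;
    font-weight: 700;
  }
</style>
```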
Using hreflang for international pages - is this how you do it?
My client is trying to achieve a global presence in select countries, and then track traffic from their international pages in Google Analytics. The content for the international pages is pretty much the same as for the USA pages, but the form and a few other details are different due to how product licensing has to be set up. I don't want to risk losing ranking for existing USA pages due to issues like duplicate content etc. What is the best way to approach this? This is my first foray into this and I've been scanning the Moz topics, but a number of the conversations are going over my head, so suggestions will need to be pretty simple 🙂 Is it a case of adding hreflang code to each page and creating different URLs for tracking? For example:
URL for USA: https://company.com/en-US/products/product-name/
URL for Canada: https://company.com/en-ca/products/product-name/
URL for German-language content: https://company.com/de/products/product-name/
URL for the rest of the world: https://company.com/en/products/product-name/
Intermediate & Advanced SEO | Sep 14, 2016, 12:35 PM | Caro-O
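For the question above, hreflang annotations would go in the head of every one of those pages, with each page carrying the same full set; the x-default line is an optional addition for the "rest of the world" version, not something from the original question:

```html
<link rel="alternate" hreflang="en-us" href="https://company.com/en-US/products/product-name/" />
<link rel="alternate" hreflang="en-ca" href="https://company.com/en-ca/products/product-name/" />
<link rel="alternate" hreflang="de" href="https://company.com/de/products/product-name/" />
<link rel="alternate" hreflang="en" href="https://company.com/en/products/product-name/" />
<!-- Optional: nominate the generic English page as the fallback for unmatched locales -->
<link rel="alternate" hreflang="x-default" href="https://company.com/en/products/product-name/" />
```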
Should I Keep adding 301s or use a noindex,follow/canonical or a 404 in this situation?
Hi Mozzers, I feel I am facing a double-edged sword situation. I am in the process of migrating 4 domains into one and creating the URL redirect mapping. The pages I am having the most issues with are the event pages that are past due but carry some value, as they generally have one external followed link.
www.example.com/event-2008 301 redirect to www.newdomain.com/event-2016
www.example.com/event-2007 301 redirect to www.newdomain.com/event-2016
www.example.com/event-2006 301 redirect to www.newdomain.com/event-2016
Again, these old events aren't necessarily important in terms of link equity, but they do carry some, and at the same time, continuing to add multiple 301s pointing to the same page may not be a good idea, as it will increase the page load time, which will affect the new site's performance. If I add a 404, I will lose the bit of equity in those. Noindex,follow may work, since it won't index the old domain nor the page itself, but I'm still not 100% sure about it. I am not sure how a canonical would work, since it would keep the old domain live. At this point I am not sure which direction I should follow. Thanks for your answers!
Intermediate & Advanced SEO | Jul 27, 2015, 1:25 PM | Ideas-Money-Art
Should I use change of address when moving to subdomain
Hi guys, so we had a domain that was only for one country: www.example.com. One year later we decided to go to another country, so we will have all of the current website under a country subdomain, like ae.example.com, and we did a 301 redirect.
Should I perform a change of address action from www.example.com to ae.example.com? Please help.
Thanks
Intermediate & Advanced SEO | Jun 1, 2015, 8:39 AM | awrikat
Block in robots.txt instead of using canonical?
When I use a canonical tag for pages that are variations of the same page, it basically means that I don't want Google to index this page. But at the same time, spiders will go ahead and crawl the page. Isn't this a waste of my crawl budget? Wouldn't it be better to just disallow the page in robots.txt and let Google focus on crawling the pages that I do want indexed? In other words, why should I ever use rel=canonical as opposed to simply disallowing in robots.txt?
Intermediate & Advanced SEO | Jul 23, 2014, 11:19 AM | YairSpolter
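To put the two options from the question above side by side: the robots.txt route is a single Disallow line for the variation's path (which blocks crawling but says nothing about which version should be indexed), while the canonical route leaves the variation crawlable and looks like this in its head; the URLs are placeholders:

```html
<!-- On a crawlable variation such as http://example.com/product?sort=price -->
<link rel="canonical" href="http://example.com/product" />
<!-- The variation still gets crawled, but indexing and link signals are
     consolidated onto the preferred version of the page. -->
```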
Partial duplicate content and canonical tags
Hi - I am rebuilding a consumer website, and each product page will contain a unique product image and a sentence or two about the product (and we tend to use a lot of the same words in different ways across products). I'd like to have a tabbed area below the product info that talks about the overall product line, and this content would be duplicated across all the product pages (a "Why use our products" type of thing). I'd have this duplicate content also living on its own URLs so it can be found on its own in the SERPs. The question is, do I need to add the canonical tag to this page, since there's partial duplicate content on the product pages? And if I did that, would my product pages go un-indexed? I understand how to handle completely duplicated content; it's the partial duplicate that I'm having difficulty figuring out.
Intermediate & Advanced SEO | Jan 7, 2014, 6:18 PM | Jenny1
Yoast & rel canonical for paginated Wordpress URLs
Hello, our WordPress blog at http://www.jobs.ca/career-resources has a rel-canonical issue since we added pagination to the front page and category pages. We're using Yoast, and it's incorrectly applying a rel-canonical tag referencing page 1 on page 2, 3, etc. This is a known misuse of the rel-canonical tag (per Google's Webmaster Central blog - http://googlewebmastercentral.blogspot.ca/2013/04/5-common-mistakes-with-relcanonical.html - which says rel-canonical should be replaced with rel-prev and rel-next for page 2, 3, etc.). We don't see a way to specify anywhere in Yoast's options to correct this behaviour for page 2, 3, etc. Yoast allows you to override a page's canonical URL; otherwise it automatically uses the WordPress permalink. My question is, does anyone know how to configure Yoast to properly replace rel-canonical tags with rel-prev and rel-next for paginated URLs, or do I need to look at another plugin or customize the behavior directly in my child theme code? This issue was brought up here as well: http://moz.com/community/q/canonical-help, but the only response did not relate to Yoast. (We're using WordPress 3.6.1 and Yoast "WordPress SEO" 1.4.18)
Intermediate & Advanced SEO | Oct 18, 2013, 4:04 PM | aactive
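Per the Google post cited in the question above, page 2 of a paginated archive would ideally carry rel-prev/rel-next annotations rather than a canonical pointing at page 1. A sketch of what the head of page 2 could contain, assuming WordPress's default /page/N/ permalink structure on the blog mentioned:

```html
<!-- <head> of http://www.jobs.ca/career-resources/page/2/ -->
<link rel="prev" href="http://www.jobs.ca/career-resources/" />
<link rel="next" href="http://www.jobs.ca/career-resources/page/3/" />
<!-- If a canonical is output at all, it should reference page 2 itself, never page 1 -->
<link rel="canonical" href="http://www.jobs.ca/career-resources/page/2/" />
```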
Best way to noindex an image?
Hi all, A client wanted a few pages noindexed, which was no problem using the meta robots noindex tag. However, they now want the associated images removed, some of which still appear on pages that they still want indexed. I added the images to their robots.txt file a few weeks ago (probably over a month ago, actually), but they're all still showing when you do an image search. What's the best way to noindex them for good, and how do I go about implementing it? Many thanks, Steve
Intermediate & Advanced SEO | Jul 3, 2013, 11:57 AM | steviephil