How would you handle this duplicate content - noindex or canonical?
-
Hello
Just trying to work out how best to deal with this duplicated content.
On our Canada holidays page we have a number of holidays listed (PAGE A)
http://www.naturalworldsafaris.com/destinations/north-america/canada/suggested-holidays.aspx
We also have a more specific Arctic Canada holidays page with different listings (PAGE B):
http://www.naturalworldsafaris.com/destinations/arctic-and-antarctica/arctic-canada/suggested-holidays.aspx
Of the two, the Arctic Canada page (PAGE B) receives a far higher number of visitors from organic search.
From a user perspective, people expect to see all holidays in Canada (PAGE A), including the Arctic-based ones. We can tag these to appear on both pages; however, that will mean the PAGE B content is duplicated on PAGE A.
Would it be best to set up a canonical link tag to stop this duplicate content causing an issue? Alternatively, would it be best to noindex PAGE A?
Interested to see others' thoughts. I've used this article (Jan 2011, so quite old) for reference, in case anyone else finds this topic while searching for information on something similar:
Duplicate Content: Block, Redirect or Canonical - SEO Tips
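For anyone who finds this thread later, the two options being weighed would look like this in the `<head>` of PAGE A (real URLs from the question above; a sketch of the options, not a recommendation of either):

```html
<!-- Option 1: canonical tag on PAGE A pointing at PAGE B
     (asks Google to treat PAGE B as the preferred version) -->
<link rel="canonical" href="http://www.naturalworldsafaris.com/destinations/arctic-and-antarctica/arctic-canada/suggested-holidays.aspx" />

<!-- Option 2: keep PAGE A out of the index, but let crawlers follow its links -->
<meta name="robots" content="noindex, follow" />
```

You would use one or the other, not both at once.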
-
OK, I think I understand what you are asking now.
Canonicals are for identical or near-identical pages. I don't know that those two pages would be considered to be identical, even after you added the arctic listings to the Canada page, especially as the above-the-fold content is different.
Keep in mind that the "penalty" for duplicate content is that Google will choose only one page to show, depending on which one it thinks is most relevant. And if you have one page that gets a lot more traffic and engagement, that is likely to be the one Google chooses, anyway.
If I were you, I'd probably make sure the description section at the top of each of those pages has a good bit of unique content, and maybe I'd change the titles and h1s to make them a little more different from each other (if you can do that). Then I'd just leave it at that and see what Google makes of it.
If it seems that your higher traffic page starts to lose traffic, you can always add the canonicals then, and resubmit the URL through Fetch as Google in Webmaster Tools.
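As a purely hypothetical illustration of differentiating the titles and h1s (these exact phrases are made up for the example, not taken from the site):

```html
<!-- PAGE A: the general Canada overview -->
<title>Canada Holidays &amp; Safaris | Natural World Safaris</title>
<h1>Suggested Holidays in Canada</h1>

<!-- PAGE B: the Arctic-specific page -->
<title>Arctic Canada Holidays | Natural World Safaris</title>
<h1>Suggested Holidays in Arctic Canada</h1>
```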
-
Hi both
Thank you.
Linda - It's people arriving at the Canada page who want to see all of Canada, not the other way round. People select Canada as a destination but are also interested in our Arctic Canada trips.
The Canada page itself doesn't rank well or act as a landing-page portal; however, it is important in terms of site structure, as people check that destination once they reach the site to see whether we run trips there. People equally come to the site looking for a trip to the Arctic as a destination, so we do need both pages for the user journey.
The canonical tag would be my preference. If there is enough unique content on both pages, do you think it matters that the holidays list is the same? That could be an alternative, although we won't escape some percentage of duplication.
-
I don't recommend noindexing either page. The canonical tag should help with the duplicate content errors. If it were my site, I would list all of the holidays on one page only, by combining the two pages. If you use the canonical tag you will decrease your chances of having both pages rank; however, you will be telling the engines which page is the authoritative one.
-
First, are you sure that the people who are arriving at the arctic page really want to see all of the holidays and not just the arctic ones? The arctic page is pretty well optimized for "arctic", and it is in the title and description. Take a look in your Webmaster Tools at those pages and see which keywords are bringing them up.
If you have good reason to think that people really want the more general page (page A) but it is not getting a lot of traffic, putting that content on the arctic page (page B) probably won't solve your problem: there is evidently some reason page A is not doing as well, and you would just be spreading around content that is not working.
I don't think your answer lies in making the pages duplicates--you should actually be making them more different from each other so the arctic one is very clearly specific for arctic trips and the overview one for general inquiries.
And in the meantime you could put a prominent link at the top of your arctic page linking back to the overview page, saying something like, "For more ideas, see all of our suggested holidays." (In fact there should be a link like that on each of your specialty pages, pointing back to the general page--that will help build the authority of page A and help it rank higher in the SERPs.)
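That cross-link could be as simple as the following, using the wording above and the PAGE A URL from the original question:

```html
<p>
  For more ideas, see
  <a href="http://www.naturalworldsafaris.com/destinations/north-america/canada/suggested-holidays.aspx">all of our suggested holidays in Canada</a>.
</p>
```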
Related Questions
-
How to fix Duplicate Content Warnings on Pagination? Indexed Pagination?
Hi all! So we have a Wordpress blog that properly has pagination tags of rel="prev" and rel="next" set up for pages, but we're still getting crawl errors with MOZ for duplicate content on all of our pagination pages. Also, we are having all of our pages indexed as well. I'm talking pages as deep as page 89 for the home page. Is this something I should ignore? Is it hurting my SEO potentially? If so, how can I start tackling it for a fix? Would "noindex" or "nofollow" be a good idea? Any help would be greatly appreciated!
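For context, the setup being described usually looks something like this in the `<head>` of a paginated page (hypothetical example.com URLs; the noindex line sketches only the option the question asks about):

```html
<!-- On /blog/page/2/ of a paginated series -->
<link rel="prev" href="https://example.com/blog/" />
<link rel="next" href="https://example.com/blog/page/3/" />

<!-- Optional: keep deep pagination pages out of the index while still
     letting crawlers follow their links through to the posts -->
<meta name="robots" content="noindex, follow" />
```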
Intermediate & Advanced SEO | jampaper
Penalties for duplicate content
Hello! We have a website with various city tours and activities listed on a single page (http://vaiduokliai.lt/). The list changes depending on filtering (birthday in Vilnius, bachelor party in Kaunas, etc.). The URL doesn't change; the content changes dynamically. We need to make the URL visible for each category, then optimize it for different keywords (for example, "city tours in Vilnius" for a list of tours and activities in Vilnius, with an appropriate URL such as /tours-in-Vilnius). The problem is that activities very often overlap across categories, so there will be a lot of duplicate content on different pages. In such a case, how severe could the penalty for duplicate content be?
Intermediate & Advanced SEO | jpuzakov
Different language with direct translation: duplicate content, meta?
For a site that does NOT want a separate subdomain, directory, or TLD for a country/language, would the directly translated page (static) content/meta be duplicate? (NOT considering a translation of the term/acronym, which could exist in another language.) I.e. /SEO-city-state in English vs. /SEO-city-state in Spanish - in this example, a term/acronym that is the same in any language. Outside of duplicate content, are there other potential ranking conflicts you can think of?
Intermediate & Advanced SEO | bozzie311
[E-commerce] Duplicate content due to color variations (canonical/indexing)
Hello, We currently have a lot of color variations on multiple products with almost the same content. Even with our canonicals set, Moz's crawling tool seems to flag them as duplicate content. What we have done so far:
- Choosing the best-selling color variation (our "master product")
- Adding a rel="canonical" to every variation (with our "master product" as the canonical URL)
In my opinion, this should be enough to address the issue. However, given that Moz still flags it as duplicate, I was wondering if there is something else we should do. Should we add "noindex,follow" to our child products and "index,follow" to our master product? (That sounds like a heavy change to me.) Thank you in advance
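For reference, the canonical setup described would look like this on each color-variation page (hypothetical URLs for illustration):

```html
<!-- On /products/lounge-chair-blue and /products/lounge-chair-green,
     each pointing at the best-selling "master product" page -->
<link rel="canonical" href="https://example.com/products/lounge-chair" />
```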
Intermediate & Advanced SEO | EasyLounge
Duplicate content on sites from different countries
Hi, we have a client who currently has a lot of duplicate content between their UK and US websites. Both websites are geographically targeted (via Google Webmaster Tools) to their specific location and have the appropriate local domain extension. Is the duplicate content a major issue, given that the sites target two different countries and geographic regions? Any statement from Google about this? Regards, Bill
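One mechanism commonly used for exactly this UK/US situation is hreflang annotations, which tell Google the pages are regional alternates rather than duplicates. A minimal sketch with hypothetical URLs:

```html
<!-- Placed on BOTH the UK and US versions of the page -->
<link rel="alternate" hreflang="en-gb" href="https://example.co.uk/some-page" />
<link rel="alternate" hreflang="en-us" href="https://example.com/some-page" />
```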
Intermediate & Advanced SEO | MBASydney
How to Avoid Duplicate Content Issues with Google?
We have 1000s of audio book titles at our Web store. Google's Panda devalued our site some time ago because, I believe, of duplicate content. We get our descriptions from the publishers, which means a good deal of our description pages are the same as the publishers' = duplicate content according to Google. Although re-writing each description of the products we offer is a daunting, almost impossible task, I am thinking of re-writing the publishers' descriptions using The Best Spinner software, which allows me to replace some of the publishers' words with synonyms. I have re-written one audio book title's description, resulting in 8% unique content from the original in 520 words. I did a CopyScape check and it reported "65 duplicates." CopyScape appears to be reporting duplicates of words and phrases within sentences and paragraphs; I see very little duplicate content of full sentences or paragraphs. Does anyone know whether Google's duplicate content algorithm is the same as or similar to CopyScape's? How much of an audio book's description would I have to change to stay clear of CopyScape's duplicate content algorithm? And how much to stay clear of Google's?
Intermediate & Advanced SEO | lbohen
Advice needed on how to handle alleged duplicate content and titles
Hi, I wonder if anyone can advise on something that's got me scratching my head. The following are examples of URLs which are deemed to have duplicate content and title tags. This causes around 8,000 errors, which (for the most part) are valid URLs because they provide different views on market data: e.g. #1 is the summary, while #2 is 'Holdings and Sector weightings'. #3 is odd because it's crawling the anchored link - I didn't think hashes were crawled? I'd like some advice on how best to handle these because, really, they're just queries against a master URL, and I'd like to remove the noise around duplicate errors so that I can focus on some other true duplicate URL issues we have. Here are some example URLs on the same page which are deemed duplicates:
1. http://markets.ft.com/Research/Markets/Tearsheets/Summary?s=IVPM:LSE
2. http://markets.ft.com/Research/Markets/Tearsheets/Holdings-and-sectors-weighting?s=IVPM:LSE
3. http://markets.ft.com/Research/Markets/Tearsheets/Summary?s=IVPM:LSE&widgets=1
What's the best way to handle this?
Intermediate & Advanced SEO | SearchPM
Duplicate Content on Blog
I have a blog I'm setting up. I would like to have a mini-about block on every page that gives very brief information about me and my blog, as well as a few links to the rest of the site and some social sharing options. I worry that this will get flagged as duplicate content, because a significant number of my pages will contain the same information at the top of the page, front and center. Is there anything I can do to address this? Is it as much of a concern as I am making it? Should I find some JavaScript/AJAX method for loading that content into the page dynamically, only for normal browser pageviews? Any thoughts or help would be great.
Intermediate & Advanced SEO | grayloon