Duplicate content for hotel websites - the usual nightmare? Is there any solution other than producing unique content?
-
Hiya Mozzers, I often work for hotels. A common scenario is that the hotel / resort has worked with their Property Management System to distribute their booking availability around the web to third-party booking sites - and along with the inventory, duplicate page descriptions get sent to these "partner" websites.
I was just checking duplication on one room description - 20 duplicate copies for that page alone - and there are 200 rooms, so I'm probably looking at around 4,000 instances of duplicate content that need rewriting to prevent duplicate content penalties, which will cost a huge amount of money.
Is there any other solution? Perhaps ask booking sites to block relevant pages from search engines?
-
Hi Kurt - very true - they should be taking the time for sure. I think part of the problem is the legacy of duplicate content - glad I'm not in their shoes!
Yup - rewriting is what I'm doing for those guys - including new ideas for engaging content. Will let you know how it goes - an interesting project for me, as I've never worked with a directory before!
-
Happy to help.
You may actually want to recommend to the brokers that they take the time to create original content. It's in their best interest since I assume they get paid for booking rooms/properties and they'd probably book more if they got more traffic by having original content.
Regarding that directory site, it's likely Google just decided it wasn't the version of the content they wanted to display. If everything else is fine with that site, I'd bet that rewriting the pages to have original content (not just spun) would change its situation dramatically.
-
Thanks for your wise feedback EGOL - appreciated.
-
Hi Kurt, and thanks for your great feedback there - funnily enough, I have just been writing unique content for these TPIs this week, so they have something different to work with if they don't want to grapple with duplicate content issues. I've noticed the clever guys are now employing their own copywriters to produce unique content, yet many do not.
Just been looking at stats for a certain directory site, and they've progressively lost traffic since Panda struck - there's absolutely nothing wrong with their website (just completed a site audit) beyond heavy duplication issues (as they've been copying and pasting property descriptions through to their own site).
-
This is exactly the kind of situation where rel=canonical is supposed to be used. Rarely is there going to be 100% exact match because in most cases the use of the duplicated content is on different sites which have different headers, footers, nav menus, etc.
Put the canonical tag on your own site and then ask the booking sites if they would put them on their pages, indicating that your page is the canonical page. If they won't, then publish your page a week or so before you give out the content to the booking sites, making sure to use the canonical tag on your own site. That way, Google can find it first.
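As a concrete sketch of what that looks like in markup (using a hypothetical hotel URL, not one from this thread), the tag on your own page and on each partner copy would both point at your version:

```html
<!-- In the <head> of the room page on the hotel's own site -->
<link rel="canonical" href="https://www.example-hotel.com/rooms/deluxe-suite/" />

<!-- In the <head> of each booking/partner site's copy of that room page,
     also pointing back to the hotel's original (a cross-domain canonical) -->
<link rel="canonical" href="https://www.example-hotel.com/rooms/deluxe-suite/" />
```

Worth noting that a cross-domain rel=canonical is treated by Google as a strong hint rather than a directive, so the partner sites' cooperation helps but isn't guaranteed to be honored.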
Another option would be to write unique content for your own site and then send out something different to all the booking sites. Yes, they will all have duplicate content, but your site won't. So, you should rank just fine and they will have to compete to see who can get in the listings.
Keep in mind that there isn't really a duplicate content penalty. When Google sees duplicates, they just don't include all of the duplicates in their search results. They choose the one they think is the canonical version and the others are left out. Not every page gets listed, but no site is penalized either.
Kurt Steinbrueck
OurChurch.Com
-
I agree with EGOL and was going to suggest the same thing: rel=canonical.
-
It is supposed to be used on exact match duplicates. However, I know that it works on less than exact match. How far it can be stretched, I have no idea.
-
Can you use rel=canonical effectively if the duplication of a page is extensive yet only partial? In this instance I'm sometimes seeing, say, three-paragraph room descriptions - e.g. the first paragraph is a carbon copy, while paragraphs 2 and 3 mix duplicate content with some new content.
-
rel=canonical (if you started with original content and can get everyone everywhere to use it and none of it gets stolen)
-
Hi Luke,
I guess using the noindex directive would be the best option here, no?
Best regards,
Michel
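If the booking sites did agree to block their copies, the noindex Michel mentions would be a robots meta tag in the head of each duplicate page (a generic sketch, not tied to any site in this thread):

```html
<!-- In the <head> of each duplicate page on the partner site -->
<meta name="robots" content="noindex, follow" />
```

The equivalent can also be sent as an HTTP response header (X-Robots-Tag: noindex), and "follow" leaves the page's links crawlable. Keep in mind the page must remain crawlable (not blocked in robots.txt) for the tag to be seen at all.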
Related Questions
-
Same site serving multiple countries and duplicated content
Hello! Though I browse Moz resources every day, I've decided to ask you a question directly despite the numerous questions (and answers!) about this topic, as there are a few specific variants each time: I have a site serving content (and products) to different countries, built using subfolders (one subfolder per country). Basically, it looks like this:
site.com/us/
site.com/gb/
site.com/fr/
site.com/it/
etc. The first problem was fairly easy to solve:
Avoid duplicated content issues across the board, considering that both the ecommerce part of the site and the blog are replicated for each subfolder in its own language. Correct me if I'm wrong, but using our copywriters to translate the content and adding the right hreflang tags should do. But then comes the second problem: how to deal with duplicated content when it's written in the same language? E.g. /us/, /gb/, /au/ and so on.
Given the following requirements/constraints, I can't see any positive resolution to this issue:
1. This structure needs to be maintained (it's not possible to consolidate the same language within one single subfolder, for example),
2. Articles from one subfolder to another can't be canonicalized, as it would mess with our internal tracking tools,
3. The amount of content being published prevents us from getting bespoke content for each region of the world with the same spoken language. Given those constraints, I can't see a way to solve this, and it seems that I'm cursed to live with those duplicated content red flags right up my nose.
Am I right, or can you think of anything to sort that out? Many thanks,
Ghill
-
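For a subfolder structure like the one in the question above, same-language regional versions are normally annotated with hreflang rather than canonicalized away. A sketch assuming the site.com paths from the question and a hypothetical /some-page/ URL:

```html
<!-- In the <head> of every regional version of the same page;
     each version lists all alternates, including itself -->
<link rel="alternate" hreflang="en-us" href="https://site.com/us/some-page/" />
<link rel="alternate" hreflang="en-gb" href="https://site.com/gb/some-page/" />
<link rel="alternate" hreflang="en-au" href="https://site.com/au/some-page/" />
<link rel="alternate" hreflang="x-default" href="https://site.com/us/some-page/" />
```

hreflang tells Google the pages are deliberate regional alternates so it can serve the right one per country; it doesn't consolidate signals the way a canonical does, but same-language regional variants marked this way are generally not treated as a duplicate content problem.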
Posting same content to different high authority websites
Let's say one article is highly relevant to multiple states and we pitch this article across the different domains in those states. Each piece will be tweaked to localize the content. I understand that Google devalues links coming from low-quality websites that are spun up, but what about links from what is basically the same content (but localized) across different high-authority domains?
Intermediate & Advanced SEO | imjonny123
-
Hreflang tag could solve any duplicate content problems on the different versions??
I have run across a couple of articles recently suggesting that using the hreflang tag could solve any SEO problems associated with having duplicate content on the different versions (.co.uk, .com, .ca, etc.). Here is an example: http://www.emarketeers.com/e-insight/how-to-use-hreflang-for-international-seo/ Over to you and your technical colleagues, I think ….
Intermediate & Advanced SEO | JordanBrown
-
Google WMT Showing Duplicate Content, But There is None
In the HTML improvements section of Google Webmaster Tools, it is showing duplicate content, and I have verified that the duplicate content they are listing does not exist. I actually have another duplicate content issue I am baffled by, but that is already being discussed on another thread. These are the pages they are saying have duplicate META descriptions:
http://www.hanneganremodeling.com/bathroom-remodeling.html (META from bathroom remodeling page)
<meta name="description" content="Bathroom Remodeling Washington DC, Bathroom Renovation Washington DC, Bath Remodel, Northern Virginia, DC, VA, Washington, Fairfax, Arlington, Virginia" />
http://www.hanneganremodeling.com/estimate-request.html (META from estimate page)
<meta name="description" content="Free estimates basement remodeling, bathroom remodeling, home additions, renovations estimates, Washington DC area" />
Intermediate & Advanced SEO | WebbyNabler
-
Duplicate content on subdomains.
Hi Mozzers, I have a site, www.xyz.com, and also geo-targeted subdomains: www.uk.xyz.com, www.india.xyz.com and so on. All the subdomains have the same content as the main domain, www.xyz.com. So, I want to know how I can avoid content duplication. Many thanks!
Intermediate & Advanced SEO | HiteshBharucha
-
What to do when unique content is out of the question?
SEO companies/people are always stating that unique, quality content is one of the best things for SEO... but what happens when you can't do that? I've got a movie trailer blog, and of late a lot of movie agencies are asking us to use the text description they give us along with the movie trailer. This means that some pages are going to have NO unique content. What do you do in a situation like this?
Intermediate & Advanced SEO | RichardTaylor
-
Managing Large Regulated or Required Duplicate Content Blocks
We work with a number of pharmaceutical sites that, under FDA regulation, must include an "Important Safety Information" (ISI) content block on each page of the site. In many cases this duplicate content is not only provided on a specific ISI page, it is quite often longer than what would be considered the primary content of the page. At first blush a rel=canonical tag might appear to be a solution to signal search engines that there is a specific page for the ISI content and avoid being penalized, but the pages also contain original content that should be indexed, as it has user benefit beyond the information contained within the ISI. Is anyone else running into this challenge with regulated duplicate boilerplate, and have you developed a workaround for handling duplicate content at the paragraph level rather than the page level? One clever suggestion was to treat it as a graphic; however, for a pharma site this would be a huge graphic.
Intermediate & Advanced SEO | BlooFusion38