Duplicate content for hotel websites - the usual nightmare? Is there any solution other than producing unique content?
-
Hiya Mozzers, I often work for hotels. A common scenario: the hotel / resort has worked with their Property Management System to distribute their booking availability around the web to third-party booking sites - and along with the inventory go duplicate page descriptions, sent to these "partner" websites.
I was just checking duplication on one room description - 20 duplicate copies of that page's description alone. There are 200 rooms, so I'm probably looking at around 4,000 duplicate descriptions that need rewriting to avoid duplicate content problems, which will cost a huge amount of money.
Is there any other solution? Perhaps asking the booking sites to block the relevant pages from search engines?
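(In case it helps anyone sizing up a rewrite budget: here's a minimal sketch of how I've been rough-checking which partner copies are near-duplicates. The site names and descriptions are made up, and this is obviously not what any SEO tool does under the hood - just stdlib Python.)

```python
# Minimal sketch: flag which partner sites carry a near-copy of our
# room description. Site names and text here are hypothetical.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Rough 0.0-1.0 similarity between two description texts."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def flag_duplicates(original: str, partner_copies: dict, threshold: float = 0.8) -> dict:
    """Return {site: score} for copies that are mostly our own text."""
    scores = {site: round(similarity(original, text), 2)
              for site, text in partner_copies.items()}
    return {site: s for site, s in scores.items() if s >= threshold}

ours = "A sea-view double room with private balcony, breakfast included."
copies = {
    "booking-site-a.example": "A sea-view double room with private balcony, breakfast included.",
    "booking-site-b.example": "Cosy single in the city centre, close to the station.",
}
print(flag_duplicates(ours, copies))  # → {'booking-site-a.example': 1.0}
```

Anything scoring above the threshold goes on the rewrite list first.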
-
Hi Kurt - very true - they should be taking the time for sure. I think part of the problem is the legacy of duplicate content - glad I'm not in their shoes!
Yup - rewriting is what I'm doing for those guys - including new ideas for engaging content. Will let you know how it goes - an interesting project for me, as I've never worked with a directory before!
-
Happy to help.
You may actually want to recommend to the brokers that they take the time to create original content. It's in their best interest since I assume they get paid for booking rooms/properties and they'd probably book more if they got more traffic by having original content.
In regards to that directory site, it's likely Google just decided it wasn't the version of the content they wanted to display. If everything else is fine with that site, I'd bet that rewriting the pages to have original content (not just spun) would change their situation dramatically.
-
Thanks for your wise feedback EGOL - appreciated.
-
Hi Kurt, and thanks for your great feedback there - funnily enough I've just been writing unique content for these TPIs this week, so they have something different to work with if they don't want to grapple with duplicate content issues. I've noticed the clever guys are now employing their own copywriters to produce unique content, yet many do not.
I've just been looking at stats for a certain directory site, and they've progressively lost traffic since Panda struck. There's absolutely nothing wrong with their website (just completed a site audit) beyond heavy duplication issues, as they've been copying and pasting property descriptions through to their own site.
-
This is exactly the kind of situation rel=canonical is supposed to be used for. Rarely will there be a 100% exact match, because in most cases the duplicated content sits on different sites with different headers, footers, nav menus, etc.
Put the canonical tag on your own site and then ask the booking sites if they would put them on their pages, indicating that your page is the canonical page. If they won't, then publish your page a week or so before you give out the content to the booking sites, making sure to use the canonical tag on your own site. That way, Google can find it first.
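For reference, the tag itself is just a link element in the page's head - the domain and path below are hypothetical:

```html
<!-- On the hotel's own room page, and (ideally) on each partner
     site's copy of it, pointing at the hotel's URL as the canonical
     version. Domain and path here are hypothetical. -->
<link rel="canonical" href="https://www.example-hotel.com/rooms/sea-view-double" />
```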
Another option would be to write unique content for your own site and then send out something different to all the booking sites. Yes, they will all have duplicate content, but your site won't. So, you should rank just fine and they will have to compete to see who can get in the listings.
Keep in mind that there isn't really a duplicate content penalty. When Google sees duplicates, it just doesn't include all of them in its search results. It chooses the one it thinks is the canonical version and the others are left out. Not every page gets listed, but no site is penalized either.
Kurt Steinbrueck
OurChurch.Com
-
I agree with EGOL and was going to suggest the same thing: rel=canonical.
-
It is supposed to be used on exact-match duplicates. However, I know that it works on less-than-exact matches. How far it can be stretched, I have no idea.
-
Can you use rel=canonical effectively if the duplication of a page is extensive yet only partial? In this instance I'm sometimes seeing, say, three-paragraph room descriptions where the first paragraph is a carbon copy, while paragraphs 2 and 3 mix duplicate content with some new content.
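One rough way to put a number on "partial" duplication like that is shingling - comparing overlapping 5-word chunks instead of whole pages, so a description that shares one paragraph out of three scores in the middle rather than looking either identical or unrelated. A quick sketch (my own illustration with made-up text, not any tool's actual method):

```python
# Jaccard overlap of 5-word shingles: partial duplication lands
# between 0.0 (all new) and 1.0 (carbon copy).
def shingles(text: str, n: int = 5) -> set:
    words = text.lower().split()
    if len(words) < n:
        return {" ".join(words)}
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(a: str, b: str) -> float:
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb)

copied_para = "Spacious sea-view double with a private balcony overlooking the bay."
ours = copied_para + " Recently refurbished with a walk-in shower and work desk."
theirs = copied_para + " Book direct with us today for the very best available rate."
print(round(overlap(ours, theirs), 2))  # mid-range score: shared first paragraph only
```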
-
rel=canonical (if you started with original content and can get everyone everywhere to use it and none of it gets stolen)
-
Hi Luke,
I guess using a noindex directive would be the best option here, no?
Best regards,
Michel
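For completeness, the noindex Michel mentions is a meta robots tag the partner site would add to its copy of each room page (it can also be sent as an X-Robots-Tag HTTP header) - the snippet is tiny:

```html
<!-- Added by the partner site to its copy of a room page, so the
     duplicate stays out of the index. Illustrative only. -->
<meta name="robots" content="noindex, follow">
```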