Duplicate content for hotel websites - the usual nightmare? Is there any solution other than producing unique content?
-
Hiya Mozzers. I often work for hotels. A common scenario: the hotel/resort has worked with their Property Management System to distribute their booking availability around the web to third-party booking sites - and along with the inventory go duplicate page descriptions, sent to these "partner" websites.
I was just checking duplication on a room description - 20 copies of that page's description alone - and there are 200 rooms, so I'm probably looking at 4,000 instances of duplicate content that need rewriting to prevent duplicate content penalties, which will cost a huge amount of money.
Is there any other solution? Perhaps ask booking sites to block relevant pages from search engines?
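For example - and this is just a sketch with a made-up path, since every booking site's URL structure differs - the sort of robots.txt rule a partner site could add:

```text
# Hypothetical rule in a partner booking site's robots.txt
User-agent: *
Disallow: /hotel-room-descriptions/
```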
-
Hi Kurt - very true - they should be taking the time for sure. I think part of the problem is the legacy of duplicate content - glad I'm not in their shoes!
Yup - rewriting is what I'm doing for those guys - including new ideas for engaging content. Will let you know how it goes - an interesting project for me, as I've never worked with a directory before!
-
Happy to help.
You may actually want to recommend to the brokers that they take the time to create original content. It's in their best interest since I assume they get paid for booking rooms/properties and they'd probably book more if they got more traffic by having original content.
In regards to that directory site, it's likely Google just decided they weren't the version of the content they wanted to display. If everything else is fine with that site, I'd bet just rewriting the pages to have original content (not just spun) would change their situation dramatically.
-
Thanks for your wise feedback EGOL - appreciated.
-
Hi Kurt, and thanks for your great feedback there - funnily enough I've just been writing unique content for these TPIs this week, so they have something different to work with if they don't want to grapple with duplicate content issues. I've noticed the clever guys are now employing their own copywriters to produce unique content, yet many do not.
I've just been looking at stats for a certain directory site and they've progressively lost traffic since Panda struck - there's absolutely nothing wrong with their website (just completed a site audit) beyond heavy duplication issues (they've been copying and pasting property descriptions through to their own site).
-
This is exactly the kind of situation where rel=canonical is supposed to be used. Rarely is there going to be 100% exact match because in most cases the use of the duplicated content is on different sites which have different headers, footers, nav menus, etc.
Put the canonical tag on your own site and then ask the booking sites if they would put them on their pages, indicating that your page is the canonical page. If they won't, then publish your page a week or so before you give out the content to the booking sites, making sure to use the canonical tag on your own site. That way, Google can find it first.
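To illustrate - the URL below is just a placeholder - the tag sits in the head of your page and, ideally, each partner's copy, pointing at your version:

```html
<!-- Hypothetical example: tells search engines the original hotel page is the canonical version -->
<link rel="canonical" href="https://www.example-hotel.com/rooms/deluxe-double/" />
```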
Another option would be to write unique content for your own site and then send out something different to all the booking sites. Yes, they will all have duplicate content, but your site won't. So, you should rank just fine and they will have to compete to see who can get in the listings.
Keep in mind that there isn't really a duplicate content penalty. When Google sees duplicates, they just don't include all of the duplicates in their search results. They choose the one they think is the canonical version and the others are left out. Not every page gets listed, but no site is penalized either.
Kurt Steinbrueck
OurChurch.Com -
I agree with EGOL and was going to suggest the same thing: rel=canonical.
-
It is supposed to be used on exact match duplicates. However, I know that it works on less than exact match. How far it can be stretched, I have no idea.
-
Can you use rel=canonical effectively if the duplication of a page is extensive yet only partial? In this instance I'm sometimes seeing, say, three-paragraph room descriptions where the first paragraph is a carbon copy, while paragraphs two and three mix duplicate content with some new content.
-
rel=canonical (if you started with original content and can get everyone everywhere to use it and none of it gets stolen)
-
Hi luke,
I guess using the noindex directive would be the best option here, no?
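Something like this in the head of the partner pages, for instance (just a sketch - "follow" is kept so crawlers can still follow the page's links):

```html
<!-- Keeps the page out of the search index but lets crawlers follow its links -->
<meta name="robots" content="noindex, follow" />
```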
Best regards,
Michel