Duplicate content for hotel websites - the usual nightmare? Is there any solution other than producing unique content?
-
Hiya Mozzers. I often work for hotels. A common scenario: the hotel/resort has worked with their Property Management System to distribute their booking availability around the web to third-party booking sites, and along with the inventory go duplicate page descriptions sent to these "partner" websites.
I was just checking duplication on a room description - 20 copies of the duplicate description for that page alone. There are 200 rooms, so I'm probably looking at around 4,000 duplicate copies that need rewriting to prevent duplicate content penalties, which will cost a huge amount of money.
Is there any other solution? Perhaps ask booking sites to block relevant pages from search engines?
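If a booking site were willing to block those pages, the simplest mechanism would be a robots.txt rule on their side. A hypothetical sketch, assuming the partner site keeps the syndicated descriptions under a /hotels/ path (the path is an assumption, not from this thread):

```
# Hypothetical robots.txt on a partner booking site.
# Assumes syndicated hotel descriptions live under /hotels/ -
# adjust the path to wherever the duplicated pages actually sit.
User-agent: *
Disallow: /hotels/
```

Note that Disallow only stops crawling; a page can still be indexed from links alone, which is why the replies below lean towards rel=canonical or noindex instead.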
-
Hi Kurt - very true, they should be taking the time for sure. I think part of the problem is the legacy of duplicate content - glad I'm not in their shoes!
Yup - rewriting is what I'm doing for those guys, including new ideas for engaging content. Will let you know how it goes - an interesting project for me, as I've never worked with a directory before!
-
Happy to help.
You may actually want to recommend to the brokers that they take the time to create original content. It's in their best interest since I assume they get paid for booking rooms/properties and they'd probably book more if they got more traffic by having original content.
In regards to that directory site, it's likely Google just decided theirs wasn't the version of the content it wanted to display. If everything else is fine with that site, I'd bet just rewriting the pages to have original content (not just spun) would change their situation dramatically.
-
Thanks for your wise feedback EGOL - appreciated.
-
Hi Kurt, and thanks for your great feedback there. Funnily enough, I have just been writing unique content for these TPIs this week, so they have something different to work with if they don't want to grapple with duplicate content issues. I've noticed the clever guys are now employing their own copywriters to produce unique content, yet many do not.
I've just been looking at stats for a certain directory site, and they've progressively lost traffic since Panda struck. There's absolutely nothing wrong with their website (I've just completed a site audit) beyond heavy duplication issues, as they've been copying and pasting property descriptions through to their own site.
-
This is exactly the kind of situation where rel=canonical is supposed to be used. Rarely is there going to be a 100% exact match, because in most cases the duplicated content appears on different sites, which have different headers, footers, nav menus, etc.
Put the canonical tag on your own site and then ask the booking sites if they would put them on their pages, indicating that your page is the canonical page. If they won't, then publish your page a week or so before you give out the content to the booking sites, making sure to use the canonical tag on your own site. That way, Google can find it first.
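For reference, the tag itself is a single line in each page's head. A minimal sketch, using a hypothetical hotel URL (the domain and path here are made up for illustration):

```html
<!-- On the hotel's own room page, and ideally also on each
     partner site's copy, pointing back to the hotel's original -->
<link rel="canonical" href="https://www.example-hotel.com/rooms/deluxe-sea-view" />
```

Every duplicate that carries this tag tells Google which URL should be treated as the original, which is why getting the partner sites to add it matters so much.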
Another option would be to write unique content for your own site and then send out something different to all the booking sites. Yes, they will all have duplicate content, but your site won't. So, you should rank just fine and they will have to compete to see who can get in the listings.
Keep in mind that there isn't really a duplicate content penalty. When Google sees duplicates, it just doesn't include all of the duplicates in its search results. It chooses the one it thinks is the canonical version and the others are left out. Not every page gets listed, but no site is penalized either.
Kurt Steinbrueck
OurChurch.Com
-
I agree with EGOL and was going to suggest the same thing: rel=canonical.
-
It is supposed to be used on exact match duplicates. However, I know that it works on less than exact match. How far it can be stretched, I have no idea.
-
Can you use rel=canonical effectively if the duplication of a page is extensive yet only partial? In this instance I'm sometimes seeing, say, three-paragraph room descriptions where the first paragraph is a carbon copy, yet paragraphs 2 and 3 mix duplicate content with some new content.
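When auditing that kind of partial duplication at scale, it can help to score each paragraph pair rather than eyeball whole pages. A rough sketch (my own, not from this thread) using Python's standard difflib module, with made-up room descriptions as the input:

```python
# Score paragraph-by-paragraph similarity between two room descriptions.
# A ratio of 1.0 means a carbon copy; lower values mean partial overlap.
from difflib import SequenceMatcher

def paragraph_overlap(text_a, text_b):
    """Return a similarity ratio (0.0-1.0) for each aligned paragraph pair."""
    paras_a = [p.strip() for p in text_a.split("\n\n") if p.strip()]
    paras_b = [p.strip() for p in text_b.split("\n\n") if p.strip()]
    return [SequenceMatcher(None, a, b).ratio()
            for a, b in zip(paras_a, paras_b)]

# Hypothetical example: first paragraph identical, second only partly similar.
original = "Deluxe sea-view room with balcony.\n\nIncludes breakfast and Wi-Fi."
partner  = "Deluxe sea-view room with balcony.\n\nBreakfast is served daily."

ratios = paragraph_overlap(original, partner)
print(ratios)
```

Running something like this over the 200 room pages would show quickly which paragraphs are carbon copies and which only partially overlap, so rewriting effort goes where it counts.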
-
rel=canonical (if you started with original content and can get everyone everywhere to use it and none of it gets stolen)
-
Hi Luke,
I guess using the noindex directive would be the best option here, no?
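For anyone reading later, the directive would sit in the head of each partner page that duplicates the description. A minimal sketch (hypothetical, and it has to go on the partner's pages, not the hotel's own):

```html
<!-- On each booking-site page that carries the duplicated description -->
<meta name="robots" content="noindex" />
```

The trade-off versus rel=canonical: noindex drops the partner pages from the index entirely, while canonical consolidates their signals onto the hotel's original page.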
Best regards,
Michel