Duplicate user reviews from a hotel-based database?
-
Hello, I've just taken on a new client who has a hotel comparison site. The problem is that the reviews and the hotel data are all pulled in from a database which is shared and used by other website owners. This obviously raises the issue of duplicate content and Panda.
I read this post by Dr Pete: http://www.seomoz.org/blog/fat-pandas-and-thin-content
but am unsure what steps to take. Any feedback would be much appreciated. It's about 200,000 pages.
Thanks
Shehzad
-
Hi Shehzad,
This is really tricky, as there can be legitimate reasons to have duplicate content. Your example of hotel reviews is a good one: those reviews can be useful to the end user (and help conversions) whether they are unique or not. However, as we all know, Google really isn't a fan of duplicate content.
A lot of people would scream "dupe" and tell you to instantly remove it all. Generally that isn't bad advice, but it's worth some thought first:
I think the first call is to decide how useful they are if you completely ignore search. Knowing that they are not going to help with rankings, do they still warrant a place on that page? If you think (or, better still, have tested and know) that they contribute to the business in a meaningful way outside of search, then you may well want to keep them. This should be fairly easy to split test, I would imagine: you can look at the effect on conversions, the likelihood that visitors will return, average commissions, etc.
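To make that split test concrete, here's a minimal sketch of how you might check whether showing the reviews actually moved conversions, using a standard two-proportion z-test. The visitor and conversion numbers are entirely hypothetical, just to illustrate the calculation:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: did showing the reviews (arm B) change the
    conversion rate versus hiding them (arm A)?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical numbers: 10,000 sessions per arm
z = two_proportion_z(conv_a=300, n_a=10_000, conv_b=345, n_b=10_000)
print(round(z, 2))  # |z| > 1.96 would be significant at the 95% level
```

If the z-score clears your significance threshold, that's evidence the reviews earn their place regardless of what they do (or don't do) for rankings.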
If you imagine that the duplicate content might actually negatively affect your rankings (we'll come to that), is it still worth keeping?
If you think the answer is yes, then you need to ensure there is enough unique material on your pages that they deserve to rank with or without that duplicate content. Plenty of good sites do contain duplicate content, and they don't always have more authority than others using the same material. However, they will sandwich it between useful new content that deserves to rank anyway.
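One rough way to sanity-check "enough unique content" across a lot of pages is to measure what fraction of each page's text comes from the shared feed. This is just an illustrative heuristic (not any metric Google publishes), and the sample page text is made up:

```python
def unique_ratio(page_text, syndicated_blocks):
    """Rough heuristic: the fraction of a page's words that are NOT part
    of the syndicated (shared-database) blocks."""
    shared_words = sum(len(block.split()) for block in syndicated_blocks)
    total_words = len(page_text.split())
    return max(0.0, (total_words - shared_words) / total_words)

page = "Our take on the Grand Hotel plus two reviews pulled from the shared feed"
reviews = ["two reviews pulled from the shared feed"]
print(unique_ratio(page, reviews))  # 0.5 -> half the page is unique
```

Pages scoring near zero are the ones that are essentially a reprint of the shared database, and those are the most exposed to Panda-style filtering.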
Getting unique content for 200,000 pages isn't going to be easy, but it can be done. You can prioritise, though, and start building it up from wherever it is likely to have the most benefit. I'd imagine user-generated content would play a very substantial part unless you have a lot of budget, so start thinking about creative ways to get the public to write for you.
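The prioritisation step can be sketched very simply: score each page by search demand times how duplicate-heavy it currently is, then work down the list. The URLs, search volumes, and the `unique_ratio` field here are all hypothetical placeholders:

```python
# Sketch: with 200,000 pages, start writing unique content where it pays
# off most. Score = search demand x how thin (duplicate-heavy) the page is.
pages = [
    {"url": "/hotels/london-ritz",  "monthly_searches": 9000, "unique_ratio": 0.10},
    {"url": "/hotels/leeds-budget", "monthly_searches": 400,  "unique_ratio": 0.05},
    {"url": "/hotels/paris-marais", "monthly_searches": 6500, "unique_ratio": 0.60},
]

def priority(page):
    # High demand plus little unique content = biggest expected benefit
    return page["monthly_searches"] * (1 - page["unique_ratio"])

for page in sorted(pages, key=priority, reverse=True):
    print(page["url"], round(priority(page)))
```

In practice you'd feed this from analytics and keyword data rather than hand-typed dictionaries, but the principle is the same: don't write 200,000 pages' worth of copy in alphabetical order.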