Duplicate Content Question
-
Currently, we manage a site that generates content from a database based on user search criteria such as location or type of business. Although we currently rank well -- we built the website around providing value to the visitor, with options for viewing the content -- we are concerned about duplicate content issues and whether they would apply.
For example, the listing that is pulled up for one search could have the same content as another search, but in a different order -- similar to hotels that offer room booking by room type or by rate.
Would this dynamically generated content count as duplicate content?
The site has done well, but we don't want to risk any future Google penalties caused by duplicate content. Thanks for your help!
-
Thank you for the example you provided; that's exactly what I meant.
You have the following "default" display:
http://www.neworleansrestaurants.com/restaurants/
and the following one which is a "variant" of the first one:
http://www.neworleansrestaurants.com/restaurants/?loc=all
You are actually showing the "same" listings ordered differently... so a rel=canonical, in my opinion, will keep you safe.
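For example (a sketch based on the two URLs above), the variant page could carry a canonical tag in its head pointing at the default listing page:

```html
<!-- On http://www.neworleansrestaurants.com/restaurants/?loc=all -->
<link rel="canonical" href="http://www.neworleansrestaurants.com/restaurants/" />
```

That tells Google which of the two URLs to treat as the one to index.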
-
I can't give you the specific example from that site because it's undergoing a redesign.
However, we have a similar issue on a sister site. It has 2 separate pages with the same listings but different categories:
By location: http://www.neworleansrestaurants.com/restaurants/?loc=all
By type of restaurant: http://www.neworleansrestaurants.com/restaurants/
Thanks for the feedback and information, Fabrizio.
-
I don't understand why the content of those two pages is the same if they show different categories... Are the same listings ordered differently? Can we have a look at those pages?
-
On our site, the only difference is that different pages show different results. That is, the page with results A has a title tag and content related to page A, and the page with results B is also a unique page with a unique title tag. In this case the listings are the same, but they appear on two pages, each with a unique category that should have its own page. Here, the categories are “location” and “type.”
-
I would need to have a look at your website to understand how it is structured, but I have a very similar case on my own site, virtualsheetmusic.com, and I think it is a common case for e-commerce websites in general. The best way to avoid any issues is to use a rel=canonical tag.
For example, if your page URL for a search can vary in the following way:
http://www.yoursite.com/search.php [assuming this is the "default" page display]
http://www.yoursite.com/search.php?sort=title
http://www.yoursite.com/search.php?sort=title&filter=NY
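All three of those URLs return the same result set. As a sketch (in Python, with the parameter names taken from the example URLs above -- adjust the set for your own site), every variant reduces to the same canonical target once the sort/filter parameters are stripped:

```python
from urllib.parse import urlsplit, urlunsplit

# Query parameters that only re-order or filter the same listings.
# These names come from the example URLs above; yours may differ.
VARIANT_PARAMS = {"sort", "filter", "loc"}

def canonical_url(url):
    """Strip variant query parameters so every sorted/filtered
    view points back at the default page display."""
    parts = urlsplit(url)
    kept = [p for p in parts.query.split("&")
            if p and p.split("=")[0] not in VARIANT_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       "&".join(kept), parts.fragment))

print(canonical_url("http://www.yoursite.com/search.php?sort=title&filter=NY"))
# -> http://www.yoursite.com/search.php
```

Whether you compute the canonical URL like this or hard-code it per template, the point is that all variants emit the same target.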
I would put a rel=canonical on each of those variants pointing to the "default" version of the page. That would avoid any duplicate issues very easily!
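Concretely (a sketch using the example URLs above), each variant page would carry the same tag in its head:

```html
<!-- In the <head> of search.php?sort=title and search.php?sort=title&filter=NY -->
<link rel="canonical" href="http://www.yoursite.com/search.php" />
```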
Also, if you have paginated content (two or more pages of results), you may want to add the rel=prev and rel=next definitions as suggested by Google:
http://googlewebmastercentral.blogspot.com/2011/09/pagination-with-relnext-and-relprev.html
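For instance (a sketch with a hypothetical `page` parameter, following the pattern Google describes), page 2 of a result set would reference its neighbors like this:

```html
<!-- In the <head> of search.php?page=2 -->
<link rel="prev" href="http://www.yoursite.com/search.php?page=1" />
<link rel="next" href="http://www.yoursite.com/search.php?page=3" />
```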
I hope this helps.
-
Hi, if you're trying to make your website better for the end user, you almost can't lose. Google wants what the end user wants: fast page load times, relevant content, and easy navigation, to name a few of the things that are important to both Google and visitors. You'll find that if you match the two, you will almost always get it right.
I hope this has been of help. Sincerely,
Thomas
-
Thanks for the feedback Thomas. I should note that this situation is all on one website.
-
I believe the easiest way to answer this is: if you have websites A and B, will I get the exact same answer if I query whatever the keyword "example" is on both websites? And will I always get the same answer?
If the answer is yes -- the same query returns the same answer from each website -- then I would say you will have trouble with the content. If the answer is no, I would say you want to examine further how much of it is identical.
I'm not a fan of having identical content, especially when you control it. If it is the same result, then yes, you'll run into content issues with Google, and I would not recommend creating an additional website serving content from the same database, because it sounds to me like you would be getting identical answers for queries -- is this correct?
I do understand how you're gathering content from the database, so it would have to be identical, right? If that's the case, I would not create an additional website. I would only create a new website if you need to cover a different subject; if you already have one, just focus on creating a better version of that website.
I hope this is of help,
Thomas