What is considered duplicate content?
-
Hi,
We are working on a product page for bespoke camper vans: http://www.broadlane.co.uk/campervans/vw-campers/bespoke-campers . At the moment there is only one page, but we are planning to add similar pages for other brands of camper van. Each page will receive its own specifically targeted content; however, the 'Model choice' chart at the bottom (giving you the choice to select the internal structure of the van) will remain the same across all pages.
Will this be considered duplicate content? And if that is the case, what would be the ideal solution to limit the penalty risk? A rel=canonical tag seems wrong for this, as there is no original item as such. Would an iframe around the 'Model choice' section let us stop that content being indexed as part of the page?
Thanks,
Celine
-
Hi Celine,
Google is very smart at finding content these days, so I would avoid any possible ways of trying to hide it, but looking at what is there, I wouldn't worry too much.
The model choice section at the bottom of the page is the same across pages for a reason, and there is no practical way around that. Even so, I wouldn't expect Google to treat it as duplicate content. Lists like this don't normally cause issues; it is more often larger 'chunks' of repeated copy that cause problems.
There are other considerations that you might want to think about before releasing a lot more pages in this manner, and one of them is making sure Google won't see the pages as existing for no other reason than to draw in search traffic for particular phrases.
Keep the pages well stocked with unique relevant content and you should be good to go.
-Andy
-
A quick way to figure this out.
Copy an entire paragraph from the content in question and paste the whole paragraph into Google search.
Any close matches? Then it is duplicate.
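As a rough local complement to this manual check, you can also score how closely two blocks of copy match with Python's standard-library difflib (this is my own sketch, not part of Mike's method; the sample sentences are made up):

```python
from difflib import SequenceMatcher

def similarity(text_a, text_b):
    """Return a 0..1 ratio of how closely two blocks of copy match."""
    return SequenceMatcher(None, text_a.lower(), text_b.lower()).ratio()

shared = "Choose the internal layout of your bespoke camper van."
unique = "Our VW conversions are hand-built in the UK to your spec."

print(similarity(shared, shared))  # 1.0 - an identical block, i.e. duplicate
print(similarity(shared, unique))  # much lower for genuinely different copy
```

Anything scoring near 1.0 between two of your own pages is the kind of 'chunk' worth rewriting.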
Thanks,
- Mike Bean
-
This type of duplicate content is common on ecommerce websites, and it isn't necessarily a big problem. However, given that there will be a higher percentage of duplicate content than unique content, you run the risk of some of your pages being omitted from search results for certain queries. If that is the case, searchers will see: "In order to show you the most relevant results, we have omitted some entries very similar to the [number] already displayed. If you like, you can repeat the search with the omitted results included."
This isn't really a penalty. It's just Google being efficient with their algorithm. It shouldn't be a problem for highly targeted searches, but you may lose a little search visibility for more generic searches.
My advice is to get creative and find new ways to add more unique content to your product pages. Add testimonials, user-generated reviews, camper van adventure stories, etc.
You are right that canonical tags are wrong for this situation. Using an iframe doesn't make much sense either. Google has stated that they try to associate iframe content with the page it's embedded on anyway.
-
Hi there,
If the greater share of the content on each page is different from every other page, you should be OK. However, I'd be worried about producing a whole bunch of pages like this.
According to Google: "Duplicate content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar."
The example you give is somewhere in between: each page contains a part which is unique, but the tabulated content that would appear on every page amounts to more than the unique part. Personally, I don't think these pages would be considered duplicate content. However, if you want to be on the safe side, you could make a separate page with all possible configurations. This would also have the advantage that you could do without the tabs (at the end of 2014, John Mueller indicated that hiding content under tabs is not the best SEO strategy: https://www.seroundtable.com/google-hidden-tab-content-seo-19489.html).
I wouldn't go for the iframe solution - it's a bit of an outdated way to present information.
Hope this helps,
Dirk
-
Hi Celine
Good news: as you haven't made all the pages yet, now is the easiest time to implement new things! :-)
The best way I would recommend is utilising HTML semantics (http://www.w3schools.com/html/html5_semantic_elements.asp). You would have your main content inside the <article> element and any supporting but repetitive content in <aside> tags.
Hope that helps!
Kind regards
Jimmy
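A minimal sketch of that structure (assuming Jimmy means the <article> and <aside> elements from the linked reference; the content shown is illustrative):

```html
<body>
  <!-- Unique, page-specific copy lives in the main article -->
  <article>
    <h1>Bespoke VW Campers</h1>
    <p>Content written specifically for this brand of camper van...</p>
  </article>

  <!-- The 'Model choice' chart shared across all camper pages -->
  <aside>
    <ul>
      <li>Interior layout A</li>
      <li>Interior layout B</li>
      <li>Interior layout C</li>
    </ul>
  </aside>
</body>
```

Note that these elements signal which part is the main content of the page; they don't prevent the shared section from being indexed.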
Related Questions
-
How bad is duplicate content for ecommerce sites?
We have multiple eCommerce sites which not only share products across domains but also across categories within a single domain. Examples: http://www.artisancraftedhome.com/sinks-tubs/kitchen-sinks/two-tone-sinks/medium-rounded-front-farmhouse-sink-two-tone-scroll http://www.coppersinksonline.com/copper-kitchen-and-farmhouse-sinks/two-tone-kitchen-farmhouse-sinks/medium-rounded-front-farmhouse-sink-two-tone-scroll http://www.coppersinksonline.com/copper-sinks-on-sale/medium-rounded-front-farmhouse-sink-two-tone-scroll We have selected canonical links for each domain, but I need to know if this practice is having a negative impact on my SEO.
Intermediate & Advanced SEO | ArtisanCrafted
-
Duplicate content on subdomains
Hi All, The structure of the main website goes by http://abc.com/state/city/publication - We have a partnership with public libraries to give local users access to the publication content for free. We have over 100 subdomains (each for a specific library) that have duplicate content issues with the root domain. Most subdomains have very high page authority (the main public library and other local .gov websites link to these subdomains). Currently these subdomains are not indexed because the robots.txt file excludes bots from crawling. I am in the process of setting canonical tags on each subdomain and opening up the robots.txt file. Should I set the canonical tag on each subdomain (homepage) to the root domain version or to the specific city within the root domain?
Example 1:
Option 1: http://covina.abc.com/ = Canonical Tag = http://abc.com/us/california/covina/
Option 2: http://covina.abc.com/ = Canonical Tag = http://abc.com/
Example 2:
Option 1: http://galveston.abc.com/ = Canonical Tag = http://abc.com/us/texas/galveston/
Option 2: http://galveston.abc.com/ = Canonical Tag = http://abc.com/
Example 3:
Option 1: http://hutchnews.abc.com/ = Canonical Tag = http://abc.com/us/kansas/hutchinson/
Option 2: http://hutchnews.abc.com/ = Canonical Tag = http://abc.com/
I believe it makes more sense to set the canonical tag to the corresponding city (Option 1), but wondering if setting the canonical tag to the root domain will pass "some link juice" to the root domain and be more beneficial. Thanks!
Intermediate & Advanced SEO | NewspaperArchive
-
Duplicate content on sites from different countries
Hi, we have a client who currently has a lot of duplicate content with their UK and US website. Both websites are geographically targeted (via google webmaster tools) to their specific location and have the appropriate local domain extension. Is having duplicate content a major issue, since they are in two different countries and geographic regions of the world? Any statement from Google about this? Regards, Bill
Intermediate & Advanced SEO | MBASydney
-
Is Sitemap Issue Causing Duplicate Content & Unindexed Pages on Google?
On July 10th my site was migrated from Drupal to Google. The site contains approximately 400 pages. 301 permanent redirects were used. The site contains maybe 50 pages of new content. Many of the new pages have not been indexed, and many pages show as duplicate content. Is it possible that there is a sitemap issue that is causing this problem? My developer believes the map is formatted correctly, but I am not convinced. The sitemap address is http://www.nyc-officespace-leader.com/page-sitemap.xml I am completely non-technical, so if anyone could take a brief look I would appreciate it immensely. Thanks,
Alan
Intermediate & Advanced SEO | Kingalan1
-
Duplicate content reported on WMT for 301 redirected content
We had to 301 redirect a large number of URLs. Now Google WMT is telling me that we have tons of duplicate page titles. When I looked into the specific URLs I realized that Google is listing an old URL and the 301-redirected new URL as the source of the duplicate content. I confirmed the 301 redirect by using a server header tool to check the correct implementation of the 301 redirect from the old to the new URL. Question: Why is Google Webmaster Tools reporting duplicate content for these pages?
Intermediate & Advanced SEO | SEOAccount32
-
Blog content - what to do, and what to avoid in terms of links, when you're paying for blog content
Hi, I've just been looking at a restaurant site which is paying food writers to put food news and blogs on their website. I checked the backlink profile of the site and the various bloggers in question usually link from their blogs / company websites to the said restaurant to help promote any new blogs that appear on the restaurant site. That got me wondering about whether this might cause problems with Google. I guess they've been putting about one blog live per month for 2 years, from 12/13 bloggers who have been linking to their website. What would you advise?
Intermediate & Advanced SEO | McTaggart
-
Coupon Website Has Tons of Duplicate Content, How do I fix it?
Ok, so I just got done running my campaign on SEOMOZ for a client of mine who owns a coupon magazine company. They upload thousands of ads into their website, which gives similar-looking duplicate content, like http://coupon.com/mom-pop-shop/100 and http://coupon.com/mom-pop-shop/101. There are about 3,200 duplicates right now on the website like this. The client wants the coupon pages to be indexed and followed by search engines, so how would I fix the duplicate content but still maintain the search-ability of these coupon landing pages?
Intermediate & Advanced SEO | Keith-Eneix
-
Duplicate content via dynamic URLs where difference is only parameter order?
I have a question about the order of parameters in a URL versus duplicate content issues. The URLs would be identical if the parameter order was the same. E.g.
www.example.com/page.php?color=red&size=large&gender=male versus
www.example.com/page.php?gender=male&size=large&color=red
How smart is Google at consolidating these, and do these consolidated pages incur any penalty (is their combined "weight" equal to their individual selves)? Does Google really see these two pages as DISTINCT, or does it recognize that they are the same because they have the exact same parameters? Is this worth fixing, or does it have a trivial impact? If we have to fix it and can't change our CMS, should we set a preferred, canonical order for these URLs or 301 redirect from one version to the other? Thanks a million!
Intermediate & Advanced SEO | anthematic
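If you do go the preferred-canonical-order route, one simple way to define it is to sort the parameters alphabetically, then 301 redirect any request whose query string differs from its sorted form. A minimal sketch using Python's standard library (the URLs are the question's own examples; alphabetical order is my assumption - any fixed order works):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonical_url(url):
    """Rewrite a URL so its query parameters appear in sorted order,
    giving every parameter ordering one canonical form."""
    parts = urlsplit(url)
    params = sorted(parse_qsl(parts.query, keep_blank_values=True))
    return urlunsplit(parts._replace(query=urlencode(params)))

a = canonical_url("http://www.example.com/page.php?color=red&size=large&gender=male")
b = canonical_url("http://www.example.com/page.php?gender=male&size=large&color=red")
print(a)          # both orderings map to the same canonical URL
print(a == b)     # True
```

A request handler would then compare the incoming URL against `canonical_url(url)` and issue a 301 to the sorted form whenever they differ.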