Percentage of duplicate content allowable
-
Can you have ANY duplicate content on a page, or will the page get penalized by Google?
For example, if you used a paragraph of Wikipedia content for the definition/description of a medical term but wrapped it in unique content, is that OK, or will that land you in the Google/Panda doghouse?
If some level of duplicate content is allowable, is there a general rule-of-thumb ratio of unique to duplicate content?
thanks!
-
I don't believe you have a problem if you have a bit of duplicate content. Google does not penalize you for duplicate content; it just doesn't award you points for it.
-
That sounds like something Google will hate by default. Your problem there is the ratio of page quantity to quality and uniqueness.
-
It's difficult to give exact numbers, since Google's algorithm is Google's hidden treasure. It's safer to create completely unique content. Referring to your Wikipedia example, you can preface the definition with something like "ACCORDING TO WIKIPEDIA..." when copying it, or add reference links whenever you copy content from other sources.
Remember that Google values not only unique content but also high quality. The article should be well researched and offer something new, so it shouldn't be 200 words or fewer. Google will weigh the quality of the whole article against the copied portion and then decide whether or not to treat it as a duplicate-content article.
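If you do quote a definition verbatim, marking it up as an explicit quotation with a visible attribution is one low-risk way to do it. A minimal sketch; the URL and text here are placeholders, not a real article:

```html
<!-- Quoted definition, clearly marked up and attributed to its source -->
<p>According to
  <a href="https://en.wikipedia.org/wiki/Example_term">Wikipedia</a>:
</p>
<blockquote cite="https://en.wikipedia.org/wiki/Example_term">
  <p>Example definition text copied from the source article...</p>
</blockquote>
<!-- Your own unique commentary continues below the quote -->
<p>In practice, this term matters for our patients because...</p>
```

The `cite` attribute and visible link don't guarantee anything with Google, but they make the quotation's provenance unambiguous to both users and crawlers.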
-
"We recently launched a large 3,500-page website that auto-generates a sentence after we plug in statistical data from our database."
So the only unique content is a single sentence?
Within that sentence many of the words would need to be common as well. Consider a simple site that offered the population for any given location. "The population of [California] is [13 million] people."
In the above example only three words are unique. Maybe your pages are a bit more elaborate, but it seems to me those pages simply shouldn't be indexed. What you can do is index the main page, where users can enter the location they wish to learn about, but not each possible result (i.e. California).
Either add significantly more content, or only index the main page.
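One way to keep the thin, auto-generated result pages out of the index while leaving the main page crawlable is a robots meta tag on each generated page. A minimal sketch, assuming the pages are plain HTML templates:

```html
<!-- On each auto-generated result page: keep it out of the index,
     but let crawlers follow its links (e.g. back to the main page) -->
<head>
  <meta name="robots" content="noindex, follow">
</head>
```

The main entry page would omit this tag so it can be indexed normally.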
-
We recently launched a large 3,500-page website that auto-generates a sentence after we plug in statistical data from our database. All pages are relevant to users and provide more value than other results in the SERPs, but I think the Farmer update detected something and applied a sort of automatic penalty against us.
I sent in a reconsideration request last week; the whole project is on hold until we get a response. I'm expecting a generic answer from them.
We are debating whether to write more unique content for every page or to enter more statistical data to run some interesting correlations. I feel the statistical data would be three times more beneficial to the user, but unique content is what Google seeks and the safer bet just to get us indexed properly.
-
We're currently observing a crumbling empire of websites with auto-generated content. Google is somehow able to gauge how substantial your content is and devalue the page, and even the whole site, if it does not meet their criteria. This is especially damaging for sites that have, say, 10% great unique content while 90% of their pages are generated via tagging, browsable search, and variable-driven paragraphs of text.
Having citations is perfectly normal, but I would include a reference section just in case.
-
You can have some duplicate content in the manner you mentioned above. It is a natural and expected part of the internet that existing sources of information will be utilized.
There isn't a magic number that says "30% duplication is OK, but 31% is not." Google's algorithms are private and constantly changing. Use good sense to guide you as to whether your page is unique and offers value to users.