Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.
Free Duplicate Content Checker Tools?
-
Hi Moz,
I am really looking for free tools that can handle my content duplication issue. I visited http://moz.com/community/q/are-there-tools-to-discover-duplicate-content-issues-with-the-other-websites
which suggested Copyscape, but that is paid. I want a FREE tool to handle my duplication issue.
Thanks in Advance.
Best,
Teginder -
I was expecting this sort of answer.
Anyway, thanks @eyepaq
-
Hi,
There is no tool better (or worse) than Copyscape on the market, free or paid. However, Copyscape is really cheap - a few bucks in credits will last you a long time.
The alternative is to write a small script that checks portions of your text in Google directly - but if you run a lot of queries, Google will block your IP for a while (after serving several CAPTCHA challenges first). Been there, done that.

Hope it helps.
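The "small script" idea above can be sketched roughly like this (a minimal, hypothetical illustration - the function names are my own, and actually fetching the results page is left out, since automated queries are what triggers the CAPTCHA blocking mentioned above):

```python
import re
import urllib.parse

def extract_phrases(text, phrase_len=8, max_phrases=3):
    """Pick a few evenly spaced runs of words from the text to use as
    exact-match queries. Longer phrases (8+ words) rarely match by
    coincidence, so any other page returning them is a duplication candidate."""
    words = re.findall(r"\w+", text)
    phrases = []
    step = max(1, len(words) // (max_phrases + 1))
    for start in range(step, len(words) - phrase_len + 1, step):
        phrases.append(" ".join(words[start:start + phrase_len]))
        if len(phrases) == max_phrases:
            break
    return phrases

def google_query_url(phrase, exclude_site=None):
    """Build a Google search URL for an exact-match (quoted) query,
    optionally excluding your own domain with -site: so only
    third-party copies show up."""
    q = f'"{phrase}"'
    if exclude_site:
        q += f" -site:{exclude_site}"
    return "https://www.google.com/search?q=" + urllib.parse.quote(q)
```

You would open the generated URLs (manually, or slowly and sparingly from a script) and look for results other than your own site.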
Related Questions
-
Reusing content on different ccTLDs
We have a client with many international locations, each of which has their own ccTLD domain and website, e.g. company-name.com, company-name.com.au, company-name.co.uk, company-name.fr, etc. Each domain/website only targets their own country, and the SEO aim is for each site to only rank well within their own country. We work for an individual country's operations, and the international head office wants to re-use our content on other countries' websites. While there would likely be some optimisation of the content for each region, there may be cases where it is re-used identically. We are concerned that this will cause duplicate content issues. I've read that the separate ccTLDs should indicate to search engines that content is aimed at the different locations - is this sufficient, or should we be doing anything extra to avoid duplicate content penalties? Or should we argue that they simply must not do this at all and develop unique content for each? Thanks Julian
Content Development | | Bc.agency0 -
Should cornerstone content have 3,500 words? Does Google discern words from the main text and from the references?
Is it true that cornerstone content should have at least 3,500 words? I've done some research and found that the recommended amount is between 2K-10k. Also, the content that we create/publish has a lot of references/citations at the end of each article. Does Google discern words from the main text and from the references? Meaning should I count references as part of the word count? Thanks for the help!
Content Development | | kvillalobos0 -
Can We Publish Duplicate Content on Multi Regional Website / Blogs?
Today, I was reading Google's official article on multi-regional websites and the use of duplicate content. Right now, we are working on 4 different blogs for the following regions, and we're writing unique content for each blog. But I am thinking of using one piece of content for all 4 regional blogs. USA: http://www.bannerbuzz.com/blog/ UK: http://www.bannerbuzz.co.uk/blog/ AUS: http://www.bannerbuzz.com.au/blog/ CA: http://www.bannerbuzz.ca/blog/ Let me give you a very clear idea of it. Recently, we published one article on the USA website: http://www.bannerbuzz.com/blog/choosing-the-right-banner-for-your-advertisement/ And we want to publish this article on the UK, AUS & CA blogs without making any changes. I read the following paragraph in Google's official guidelines, and it inspires me to make it happen. What is the best solution for this? "Websites that provide content for different regions and in different languages sometimes create content that is the same or similar but available on different URLs. This is generally not a problem as long as the content is for different users in different countries. While we strongly recommend that you provide unique content for each different group of users, we understand that this may not always be possible. There is generally no need to "hide" the duplicates by disallowing crawling in a robots.txt file or by using a "noindex" robots meta tag. However, if you're providing the same content to the same users on different URLs (for instance, if both example.de/ and example.com/de/ show German language content for users in Germany), you should pick a preferred version and redirect (or use the rel=canonical link element) appropriately. In addition, you should follow the guidelines on rel-alternate-hreflang to make sure that the correct language or regional URL is served to searchers."
Content Development | | CommercePundit0 -
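For reference, the rel-alternate-hreflang markup mentioned in the quoted guideline would look roughly like this in the head of each regional copy of the article. This is an illustrative sketch only: the non-.com URLs assume the same post slug exists on each ccTLD, which may not hold in practice.

```html
<!-- Each regional version lists all alternates, including itself. -->
<link rel="alternate" hreflang="en-us" href="http://www.bannerbuzz.com/blog/choosing-the-right-banner-for-your-advertisement/" />
<link rel="alternate" hreflang="en-gb" href="http://www.bannerbuzz.co.uk/blog/choosing-the-right-banner-for-your-advertisement/" />
<link rel="alternate" hreflang="en-au" href="http://www.bannerbuzz.com.au/blog/choosing-the-right-banner-for-your-advertisement/" />
<link rel="alternate" hreflang="en-ca" href="http://www.bannerbuzz.ca/blog/choosing-the-right-banner-for-your-advertisement/" />
```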
Duplicate Content
I have a service-based client who is interested in optimizing his website for all the services that he provides in all the locations that he provides them in. For example: Service 1, location 1 Service 1, location 2 Service 2, location 1 Service 2, location 2 He wants to essentially create an individual page for each of the above, but I'm concerned that he will be penalized for duplicate content. Each of the pages would have the keyword in the URL, page title, and within the main body of content. We would certainly alter the content somewhat, but I'm not sure how much of a difference this would make. Any thoughts or advice would be greatly appreciated.
Content Development | | embracedarrenhughes1 -
Duplicate Content behind a Paywall
We have a website that is publicly visible. This website has content. We'd like to take that same content and put it on another website, behind a paywall. Since Google will not be able to crawl those pages behind the paywall, is there any risk to us doing this? Thanks! Mike
Content Development | | FOTF_DigitalMarketing0 -
Is there a way to repost content (with permission) to another site without being penalized by Google?
I write a monthly Social Media Marketing column for a local Business Journal and the column is printed in their paper as well as posted on their website. Is there any way I can repost these articles on my website's blog without being penalized by Google for "duplicate content"?
Content Development | | vyki0 -
How can I rank using translated content?
My friend has a website with similar content to mine, in a different language however. He has allowed me to translate his content if I link to it in every post (can be nofollow). Does Google penalize me for clearly translated content? How can I make sure it ranks well? BTW, if I convince him that I don't need to link to him, is it better SEO-wise? Best, Cherman
Content Development | | kikocherman0 -
Duplicate Terms of Use and Privacy Policy, is it a problem?
Hi, if I use the same terms of use and privacy policy content across my websites, does it amount to a duplicate content issue? Does it affect my websites in any manner? Regards
Content Development | | IM_Learner0