Checking for content duplication against content on your own site.
-
We are currently rewriting our product descriptions, and I'm afraid some of the salespeople writing them are plagiarizing one another's work. Is there a content duplication checker that lets you check a piece of writing against a specific site rather than the entire web?
-
I assume that you have an admin section in the CMS where you are editing and entering these articles before they go live.
You need a developer to write a simple search routine that, when you create a new article and before it goes live, takes sections of your content and looks for matches/duplicates. You can require a match on a minimum of a 4-5 word string, along with other limits, so you aren't matching too many items. It will take a few tests to find the sweet spot between too many matches and not enough.
With 17K pages, this is the only way you can really do this efficiently; you need some IT support/development. They may also have to build a reporting layer to help you sift through the results.
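A minimal sketch of that matching step, assuming the existing page bodies can be pulled out of the CMS as plain text (the 5-word minimum string is the threshold suggested above; the function and variable names are illustrative, not from any particular CMS):

```python
import re

SHINGLE_LEN = 5  # minimum matching word-string length suggested above

def shingles(text, n=SHINGLE_LEN):
    """Break text into the set of all n-word sequences it contains."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def find_overlaps(draft, existing_pages):
    """Return {page_id: matching word strings} for every live page the draft overlaps."""
    draft_shingles = shingles(draft)
    report = {}
    for page_id, body in existing_pages.items():
        hits = draft_shingles & shingles(body)
        if hits:
            report[page_id] = hits
    return report
```

Raising `SHINGLE_LEN` cuts down false positives (boilerplate phrases every description shares); lowering it catches lighter rewording. That's the sweet-spot tuning mentioned above.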
Good luck.
-
I have two dev servers, and on one of them it's possible to do what you're talking about, but that is the absolute least efficient tool to use for this.
The crawl diagnostics are updated about once a week which means I would have to post the new content and hope I got it online in time for the crawl. If I didn't then I would have to wait an additional week to see results.
The crawl diagnostics also limit the number of pages crawled on your site to 10,000, and as I stated before, I have over 17,000 pages. So even if I did use this method, the chances of a given page being crawled are little better than 50/50.
Also, the crawl diagnostics only tell you which pages have duplicate content - not the exact content that was duplicated. That means I'd have to manually find the page I'm targeting, then follow the crawler's duplicate content suggestions and find the similarities myself.
I think it's safe to say that neither the crawl diagnostics nor any other product SEOmoz provides is an answer to my issue. If I thought one was, I would already be using it and would not have posted this question.
-
Hi Michael,
Having a website that big means you might already have a test or dev environment.
If not, create one.
If you have something like test.yourwebsite.com, submit it to the SEOmoz tools as a new project and you can see a report before your website goes live.
Cornel
-
Those are good answers and would work on a smaller-scale site, but we currently have over 17,000 product pages, so I can't really use either method. It's looking like a Google Custom Search is the best bet, even though I can't search an entire paragraph at a time.
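One way around the paragraph limit, sketched under the assumption that Google truncates a query at roughly 32 words: split the draft into quoted exact-phrase queries of at most that length and run each one against a Custom Search Engine restricted to the site (the helper name and the chunk size are assumptions, not anything Google documents for this workflow):

```python
def to_phrase_queries(paragraph, max_words=32):
    """Split a paragraph into exact-phrase queries that fit under the query word cap."""
    words = paragraph.split()
    return ['"' + " ".join(words[i:i + max_words]) + '"'
            for i in range(0, len(words), max_words)]
```

Each returned string can be pasted into (or sent programmatically to) the site-restricted search box; any result at all flags a page worth checking by hand.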
-
Just off the top of my head, there are a few low tech ways to do it....
If you have Windows 7, local search has improved greatly: just move all the files to a local machine and search that directory for the content you want to check. It will list every file containing the words (though the results can get overwhelming).
If you have Dreamweaver or another enterprise-level editor, almost all of them have a site-wide search that can find, page by page or as a global list, which files contain the searched code/text.
Other than that, probably a custom script - or a Google search for an HTML profiler might help?
Shane
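Along the same low-tech lines, once the pages sit in a local folder, a short script can do what the editors above do: list every file containing an exact phrase (a sketch only; the folder name, file pattern, and function name are made up for illustration):

```python
from pathlib import Path

def files_containing(phrase, directory, pattern="*.html"):
    """Return the paths of every matching file in the directory that contains the phrase."""
    return sorted(
        str(p) for p in Path(directory).rglob(pattern)
        if phrase in p.read_text(errors="ignore")
    )
```

It's a brute-force scan, so on a 17K-page export it will take a while per phrase, but it needs no server access at all.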
-
That's for pages that are already published and crawled. I want to be able to search my site for entire sentences and/or paragraphs of text that I have yet to publish, so I can make sure they're not already being used elsewhere on the site. The crawl diagnostics tell me I have duplicate content after the fact - I'm trying to take a proactive approach rather than a reactive one.
-
Duplicate content from your website is shown in the SEOmoz tools.
Check the Crawl Diagnostics Summary.
Cornel
-
That site searches the entire web for copies. I'm looking for something to crawl my own site for duplicate content.
-